Nov 26 12:11:41 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 26 12:11:41 crc restorecon[4633]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 12:11:41 crc restorecon[4633]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 
12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41
crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 
12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 12:11:41 crc restorecon[4633]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 
crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 12:11:41 crc restorecon[4633]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 12:11:41 crc restorecon[4633]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 26 12:11:42 crc kubenswrapper[4834]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 12:11:42 crc kubenswrapper[4834]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 26 12:11:42 crc kubenswrapper[4834]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 12:11:42 crc kubenswrapper[4834]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 26 12:11:42 crc kubenswrapper[4834]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 26 12:11:42 crc kubenswrapper[4834]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.272720 4834 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275086 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275105 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275111 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275116 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275120 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275123 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275127 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275130 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275134 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 
12:11:42.275148 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275154 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275159 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275164 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275171 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275175 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275179 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275183 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275187 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275191 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275195 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275199 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275202 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275205 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275209 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275213 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275216 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275220 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275224 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275227 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275231 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275234 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275237 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275241 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275245 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275249 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275253 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275257 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275262 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275266 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275271 4834 feature_gate.go:330] unrecognized feature gate: Example
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275275 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275278 4834 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275282 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275285 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275291 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275298 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275302 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275325 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275329 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275333 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275337 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275340 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275344 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275348 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275351 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275355 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275358 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275361 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275364 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275368 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275371 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275374 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275377 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275381 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275385 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275389 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275393 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275396 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275399 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275403 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.275406 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275828 4834 flags.go:64] FLAG: --address="0.0.0.0"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275842 4834 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275851 4834 flags.go:64] FLAG: --anonymous-auth="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275857 4834 flags.go:64] FLAG: --application-metrics-count-limit="100"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275863 4834 flags.go:64] FLAG: --authentication-token-webhook="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275867 4834 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275873 4834 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275878 4834 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275883 4834 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275887 4834 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275892 4834 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275897 4834 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275900 4834 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275904 4834 flags.go:64] FLAG: --cgroup-root=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275908 4834 flags.go:64] FLAG: --cgroups-per-qos="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275911 4834 flags.go:64] FLAG: --client-ca-file=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275915 4834 flags.go:64] FLAG: --cloud-config=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275918 4834 flags.go:64] FLAG: --cloud-provider=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275922 4834 flags.go:64] FLAG: --cluster-dns="[]"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275926 4834 flags.go:64] FLAG: --cluster-domain=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275930 4834 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275933 4834 flags.go:64] FLAG: --config-dir=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275937 4834 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275941 4834 flags.go:64] FLAG: --container-log-max-files="5"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275945 4834 flags.go:64] FLAG: --container-log-max-size="10Mi"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275949 4834 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275954 4834 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275959 4834 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275963 4834 flags.go:64] FLAG: --contention-profiling="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275966 4834 flags.go:64] FLAG: --cpu-cfs-quota="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275970 4834 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275974 4834 flags.go:64] FLAG: --cpu-manager-policy="none"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275977 4834 flags.go:64] FLAG: --cpu-manager-policy-options=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275982 4834 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275985 4834 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275989 4834 flags.go:64] FLAG: --enable-debugging-handlers="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275993 4834 flags.go:64] FLAG: --enable-load-reader="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.275997 4834 flags.go:64] FLAG: --enable-server="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276001 4834 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276006 4834 flags.go:64] FLAG: --event-burst="100"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276010 4834 flags.go:64] FLAG: --event-qps="50"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276014 4834 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276018 4834 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276022 4834 flags.go:64] FLAG: --eviction-hard=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276027 4834 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276031 4834 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276035 4834 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276039 4834 flags.go:64] FLAG: --eviction-soft=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276044 4834 flags.go:64] FLAG: --eviction-soft-grace-period=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276047 4834 flags.go:64] FLAG: --exit-on-lock-contention="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276051 4834 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276055 4834 flags.go:64] FLAG: --experimental-mounter-path=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276059 4834 flags.go:64] FLAG: --fail-cgroupv1="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276062 4834 flags.go:64] FLAG: --fail-swap-on="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276066 4834 flags.go:64] FLAG: --feature-gates=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276070 4834 flags.go:64] FLAG: --file-check-frequency="20s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276074 4834 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276078 4834 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276082 4834 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276086 4834 flags.go:64] FLAG: --healthz-port="10248"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276090 4834 flags.go:64] FLAG: --help="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276094 4834 flags.go:64] FLAG: --hostname-override=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276097 4834 flags.go:64] FLAG: --housekeeping-interval="10s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276101 4834 flags.go:64] FLAG: --http-check-frequency="20s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276105 4834 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276109 4834 flags.go:64] FLAG: --image-credential-provider-config=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276112 4834 flags.go:64] FLAG: --image-gc-high-threshold="85"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276116 4834 flags.go:64] FLAG: --image-gc-low-threshold="80"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276119 4834 flags.go:64] FLAG: --image-service-endpoint=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276123 4834 flags.go:64] FLAG: --kernel-memcg-notification="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276127 4834 flags.go:64] FLAG: --kube-api-burst="100"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276131 4834 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276135 4834 flags.go:64] FLAG: --kube-api-qps="50"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276139 4834 flags.go:64] FLAG: --kube-reserved=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276143 4834 flags.go:64] FLAG: --kube-reserved-cgroup=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276146 4834 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276150 4834 flags.go:64] FLAG: --kubelet-cgroups=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276154 4834 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276159 4834 flags.go:64] FLAG: --lock-file=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276162 4834 flags.go:64] FLAG: --log-cadvisor-usage="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276166 4834 flags.go:64] FLAG: --log-flush-frequency="5s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276169 4834 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276175 4834 flags.go:64] FLAG: --log-json-split-stream="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276179 4834 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276183 4834 flags.go:64] FLAG: --log-text-split-stream="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276187 4834 flags.go:64] FLAG: --logging-format="text"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276191 4834 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276195 4834 flags.go:64] FLAG: --make-iptables-util-chains="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276199 4834 flags.go:64] FLAG: --manifest-url=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276202 4834 flags.go:64] FLAG: --manifest-url-header=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276208 4834 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276212 4834 flags.go:64] FLAG: --max-open-files="1000000"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276216 4834 flags.go:64] FLAG: --max-pods="110"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276220 4834 flags.go:64] FLAG: --maximum-dead-containers="-1"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276224 4834 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276228 4834 flags.go:64] FLAG: --memory-manager-policy="None"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276232 4834 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276236 4834 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276240 4834 flags.go:64] FLAG: --node-ip="192.168.126.11"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276244 4834 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276253 4834 flags.go:64] FLAG: --node-status-max-images="50"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276258 4834 flags.go:64] FLAG: --node-status-update-frequency="10s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276263 4834 flags.go:64] FLAG: --oom-score-adj="-999"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276268 4834 flags.go:64] FLAG: --pod-cidr=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276273 4834 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276280 4834 flags.go:64] FLAG: --pod-manifest-path=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276283 4834 flags.go:64] FLAG: --pod-max-pids="-1"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276287 4834 flags.go:64] FLAG: --pods-per-core="0"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276291 4834 flags.go:64] FLAG: --port="10250"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276295 4834 flags.go:64] FLAG: --protect-kernel-defaults="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276299 4834 flags.go:64] FLAG: --provider-id=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276302 4834 flags.go:64] FLAG: --qos-reserved=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276322 4834 flags.go:64] FLAG: --read-only-port="10255"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276330 4834 flags.go:64] FLAG: --register-node="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276335 4834 flags.go:64] FLAG: --register-schedulable="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276339 4834 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276345 4834 flags.go:64] FLAG: --registry-burst="10"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276349 4834 flags.go:64] FLAG: --registry-qps="5"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276353 4834 flags.go:64] FLAG: --reserved-cpus=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276358 4834 flags.go:64] FLAG: --reserved-memory=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276363 4834 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276367 4834 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276371 4834 flags.go:64] FLAG: --rotate-certificates="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276376 4834 flags.go:64] FLAG: --rotate-server-certificates="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276380 4834 flags.go:64] FLAG: --runonce="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276384 4834 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276388 4834 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276392 4834 flags.go:64] FLAG: --seccomp-default="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276396 4834 flags.go:64] FLAG: --serialize-image-pulls="true"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276400 4834 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276404 4834 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276408 4834 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276412 4834 flags.go:64] FLAG: --storage-driver-password="root"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276416 4834 flags.go:64] FLAG: --storage-driver-secure="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276420 4834 flags.go:64] FLAG: --storage-driver-table="stats"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276426 4834 flags.go:64] FLAG: --storage-driver-user="root"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276430 4834 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276435 4834 flags.go:64] FLAG: --sync-frequency="1m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276439 4834 flags.go:64] FLAG: --system-cgroups=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276443 4834 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276451 4834 flags.go:64] FLAG: --system-reserved-cgroup=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276455 4834 flags.go:64] FLAG: --tls-cert-file=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276459 4834 flags.go:64] FLAG: --tls-cipher-suites="[]"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276465 4834 flags.go:64] FLAG: --tls-min-version=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276469 4834 flags.go:64] FLAG: --tls-private-key-file=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276475 4834 flags.go:64] FLAG: --topology-manager-policy="none"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276480 4834 flags.go:64] FLAG: --topology-manager-policy-options=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276484 4834 flags.go:64] FLAG: --topology-manager-scope="container"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276489 4834 flags.go:64] FLAG: --v="2"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276494 4834 flags.go:64] FLAG: --version="false"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276499 4834 flags.go:64] FLAG: --vmodule=""
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276504 4834 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.276508 4834 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277079 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277141 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277149 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277156 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277163 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277172 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277177 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277185 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277194 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277201 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277224 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277235 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277241 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277247 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277252 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277258 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277263 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277269 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277275 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277280 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277285 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277367 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277379 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277388 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277393 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277401 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277406 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277412 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277418 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277424 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277435 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277451 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277458 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277465 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277476 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277483 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277489 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277497 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277504 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277510 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277516 4834 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277524 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277529 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277537 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277544 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277553 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277560 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277597 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277605 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277613 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277620 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277629 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277635 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277645 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277651 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277659 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277666 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277679 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277685 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277692 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277708 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277723 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277735 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277761 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277770 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277778 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277785 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277794 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277800 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277806 4834 feature_gate.go:330] unrecognized feature gate: Example
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.277813 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.277827 4834 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.285530 4834 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.285557 4834 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285607 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285615 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285620 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285624 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285628 4834 feature_gate.go:330] unrecognized feature gate: Example
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285632 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285637 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285642 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285648 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285652 4834
feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285656 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285660 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285663 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285667 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285671 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285674 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285678 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285681 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285687 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285692 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285695 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285699 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285702 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285706 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 26 12:11:42 crc 
kubenswrapper[4834]: W1126 12:11:42.285709 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285713 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285716 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285719 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285723 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285726 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285731 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285735 4834 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285747 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285751 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285755 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285759 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285762 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285766 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285769 
4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285773 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285777 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285783 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285787 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285791 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285795 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285799 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285803 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285807 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285810 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285814 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285818 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285822 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285826 4834 feature_gate.go:330] 
unrecognized feature gate: MinimumKubeletVersion Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285832 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285836 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285841 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285847 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285852 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285857 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285860 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285864 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285868 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285871 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285875 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285879 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285884 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285888 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285891 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285895 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285899 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.285902 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.285909 4834 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286027 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286035 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286041 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286045 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286049 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286054 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286057 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286062 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286065 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286069 4834 feature_gate.go:330] unrecognized feature gate: Example
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286072 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286076 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286081 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286086 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286090 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286094 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286098 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286102 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286106 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286110 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286114 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286119 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286122 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286127 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286130 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286134 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286139 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286143 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286147 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286151 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286155 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286159 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286162 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286166 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286170 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286174 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286177 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286181 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286184 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286189 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286192 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286196 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286199 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286203 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286206 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286210 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286214 4834 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286219 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286224 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286228 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286232 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286237 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286242 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286248 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286253 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286257 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286260 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286264 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286268 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286271 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286275 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286279 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286282 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286286 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286289 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286293 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286296 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286300 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286304 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286321 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.286325 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.286330 4834 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.286436 4834 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.289652 4834 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.289726 4834 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.290714 4834 server.go:997] "Starting client certificate rotation"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.290755 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.291001 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-07 04:07:26.704071816 +0000 UTC
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.291115 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 999h55m44.412959111s for next certificate rotation
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.304504 4834 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.305917 4834 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.316472 4834 log.go:25] "Validated CRI v1 runtime API"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.334605 4834 log.go:25] "Validated CRI v1 image API"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.335757 4834 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.339608 4834 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-26-12-08-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.339666 4834 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}]
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.355670 4834 manager.go:217] Machine: {Timestamp:2025-11-26 12:11:42.354217616 +0000 UTC m=+0.261430968 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445406 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a6b07d33-3cf7-4bfd-b095-28713e624c71 BootID:87769375-2601-4b87-b79e-f69016761287 Filesystems:[{Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e8:33:93 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:e8:33:93 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:b8:37:58 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:18:11:62 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:d1:ef:8b Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:e3:10:f6 Speed:-1 Mtu:1436} {Name:enp7s0.23 MacAddress:52:54:00:91:24:d9 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:0e:be:81:03:29:73 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:72:30:ad:4e:56:d9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.355900 4834 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.356047 4834 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.357220 4834 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.357707 4834 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.357772 4834 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.358043 4834 topology_manager.go:138] "Creating topology manager with none policy"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.358055 4834 container_manager_linux.go:303] "Creating device plugin manager"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.358495 4834 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.358532 4834 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.358710 4834 state_mem.go:36] "Initialized new in-memory state store"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.358836 4834 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.360411 4834 kubelet.go:418] "Attempting to sync node with API server"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.360439 4834 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.360468 4834 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.360483 4834 kubelet.go:324] "Adding apiserver pod source"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.360498 4834 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.363064 4834 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.363804 4834 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.364364 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.364472 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.148:6443: connect: connection refused" logger="UnhandledError" Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.364470 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.364572 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.148:6443: connect: connection refused" logger="UnhandledError" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.365283 4834 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366282 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366326 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 
12:11:42.366336 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366344 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366358 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366368 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366375 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366390 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366399 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366410 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366422 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366431 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.366829 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.367350 4834 server.go:1280] "Started kubelet" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.367536 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.368067 4834 
server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.368067 4834 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.368733 4834 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 26 12:11:42 crc systemd[1]: Started Kubernetes Kubelet. Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.370375 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.370409 4834 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.371529 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 06:39:55.024349029 +0000 UTC Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.371626 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 714h28m12.652726704s for next certificate rotation Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.372479 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" interval="200ms" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.372648 4834 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.372963 4834 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.372053 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.148:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b8d5b4fdff36d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 12:11:42.367298413 +0000 UTC m=+0.274511766,LastTimestamp:2025-11-26 12:11:42.367298413 +0000 UTC m=+0.274511766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.372670 4834 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.373207 4834 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.373586 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.373653 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.148:6443: connect: connection refused" logger="UnhandledError" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.374211 4834 server.go:460] "Adding debug handlers to kubelet server" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.375224 4834 factory.go:55] Registering systemd factory 
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.375250 4834 factory.go:221] Registration of the systemd container factory successfully Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.378189 4834 factory.go:153] Registering CRI-O factory Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.378238 4834 factory.go:221] Registration of the crio container factory successfully Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.378301 4834 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.378346 4834 factory.go:103] Registering Raw factory Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.378363 4834 manager.go:1196] Started watching for new ooms in manager Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.378981 4834 manager.go:319] Starting recovery of all containers Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384581 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384653 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384666 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" 
Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384679 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384689 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384700 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384710 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384721 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384734 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384756 4834 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384767 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384779 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384792 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384809 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384824 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384837 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384848 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384860 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384871 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384882 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384892 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384902 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384919 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384930 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384942 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384953 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384965 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.384979 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385021 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385036 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385049 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385064 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385097 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385108 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" 
seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385118 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385131 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385141 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385151 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385162 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385172 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: 
I1126 12:11:42.385182 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385192 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385203 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385214 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385223 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385236 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385247 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385260 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385271 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385282 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385292 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385302 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385331 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385342 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385352 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385362 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385372 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385381 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385391 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" 
seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385402 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385411 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385420 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385430 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385443 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385452 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: 
I1126 12:11:42.385461 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385470 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385481 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385490 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385499 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385510 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385521 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385531 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385541 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385553 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385563 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385573 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385582 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385592 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385601 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385611 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385620 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385628 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385638 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385648 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385656 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385664 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385671 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385679 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385689 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385697 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385706 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385716 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385723 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385734 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385755 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385764 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385775 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385788 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385800 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385810 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385821 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385829 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385838 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385890 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385901 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385918 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385930 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385940 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385950 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385960 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385971 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385983 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.385994 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386004 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386014 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386027 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386038 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386048 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386057 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" 
seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386068 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386077 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386086 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386095 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386104 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386113 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: 
I1126 12:11:42.386123 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386134 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386142 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386151 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386161 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386171 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386179 4834 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386190 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386201 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386215 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386224 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386233 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386245 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386255 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386264 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386274 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386284 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.386295 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387284 4834 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387323 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387338 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387350 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387362 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387371 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387383 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387393 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387403 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387412 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387431 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387440 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387453 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387464 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387473 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387485 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387531 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387542 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387553 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387563 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387576 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387587 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387608 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387618 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387627 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387637 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387650 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387661 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387670 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387683 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387694 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387708 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387718 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387729 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387749 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387761 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387772 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" 
seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387783 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387796 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387807 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387818 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387829 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387840 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387851 4834 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387865 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387879 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387891 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387903 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387914 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387927 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387937 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387948 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387958 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387971 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387984 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.387995 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388006 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388019 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388031 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388041 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388052 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388063 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" 
seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388074 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388087 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388097 4834 reconstruct.go:97] "Volume reconstruction finished" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.388106 4834 reconciler.go:26] "Reconciler: start to sync state" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.402845 4834 manager.go:324] Recovery completed Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.412005 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.413826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.413871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.413884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.414134 4834 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.414793 4834 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.414814 4834 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.414835 4834 state_mem.go:36] "Initialized new in-memory state store" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.415797 4834 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.415842 4834 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.415875 4834 kubelet.go:2335] "Starting kubelet main sync loop" Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.415916 4834 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.417225 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.417396 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.148:6443: connect: connection refused" logger="UnhandledError" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.418401 4834 policy_none.go:49] "None policy: Start" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.421660 4834 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 26 
12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.421689 4834 state_mem.go:35] "Initializing new in-memory state store" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.466267 4834 manager.go:334] "Starting Device Plugin manager" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.466471 4834 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.466571 4834 server.go:79] "Starting device plugin registration server" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.467239 4834 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.467358 4834 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.467604 4834 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.467700 4834 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.467723 4834 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.481017 4834 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.516290 4834 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.516402 4834 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.517667 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.517716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.517731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.517939 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.518057 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.518125 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.518988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.519022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.519034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.519180 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.519413 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.519483 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.519625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.519674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.519686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.520175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.520197 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.520206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.520338 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.520732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.520780 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.520806 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.520813 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.520820 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.521200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.521234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.521246 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.521376 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.521508 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.521544 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.521782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.521816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.521831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.525128 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.525179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.525206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.525485 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.525524 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.525569 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.525591 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.525605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.526501 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.526535 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.526548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.567784 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.569046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.569105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.569117 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc 
kubenswrapper[4834]: I1126 12:11:42.569153 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.569861 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.148:6443: connect: connection refused" node="crc" Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.574272 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" interval="400ms" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590025 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590091 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590119 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590140 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590164 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590187 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590206 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590229 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590252 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590305 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590371 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590403 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590438 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590465 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") 
pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.590491 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691618 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691665 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691684 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691700 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691714 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691732 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691761 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691776 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691790 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691785 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691809 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691824 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691841 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691856 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691872 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691877 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691887 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691910 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691938 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691964 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.691988 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.692011 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.692037 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.692064 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.692088 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.692114 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.692138 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.692161 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.692206 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.692236 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.769984 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.771961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.772082 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 
12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.772177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.772269 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.773285 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.148:6443: connect: connection refused" node="crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.858452 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.865338 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.878597 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.888057 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-fd8afa951bc71a2613315a5111b1b469151821bfaf892ba3c48a2113b95b1bb7 WatchSource:0}: Error finding container fd8afa951bc71a2613315a5111b1b469151821bfaf892ba3c48a2113b95b1bb7: Status 404 returned error can't find the container with id fd8afa951bc71a2613315a5111b1b469151821bfaf892ba3c48a2113b95b1bb7 Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.889828 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-09f182fbe1e74cd37a6ad4f24d6170e16b87f58bb132c25ab6d3702f8d0b7b9b WatchSource:0}: Error finding container 09f182fbe1e74cd37a6ad4f24d6170e16b87f58bb132c25ab6d3702f8d0b7b9b: Status 404 returned error can't find the container with id 09f182fbe1e74cd37a6ad4f24d6170e16b87f58bb132c25ab6d3702f8d0b7b9b Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.892207 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-f9721011c541d6140848101bfdb75128117ed99c11f722a1191961c3c80059f1 WatchSource:0}: Error finding container f9721011c541d6140848101bfdb75128117ed99c11f722a1191961c3c80059f1: Status 404 returned error can't find the container with id f9721011c541d6140848101bfdb75128117ed99c11f722a1191961c3c80059f1 Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.894498 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: I1126 12:11:42.899159 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.906786 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d5ff5fd71247956f360a97f7df2b6d34040adc8c0167ce8caeb82b83babad973 WatchSource:0}: Error finding container d5ff5fd71247956f360a97f7df2b6d34040adc8c0167ce8caeb82b83babad973: Status 404 returned error can't find the container with id d5ff5fd71247956f360a97f7df2b6d34040adc8c0167ce8caeb82b83babad973 Nov 26 12:11:42 crc kubenswrapper[4834]: W1126 12:11:42.921018 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-deb216bf222c1e13b5d774b79bf7f24433bf83539454d60b7cca23769aa4381b WatchSource:0}: Error finding container deb216bf222c1e13b5d774b79bf7f24433bf83539454d60b7cca23769aa4381b: Status 404 returned error can't find the container with id deb216bf222c1e13b5d774b79bf7f24433bf83539454d60b7cca23769aa4381b Nov 26 12:11:42 crc kubenswrapper[4834]: E1126 12:11:42.975820 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" interval="800ms" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.174214 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.176206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.176262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.176274 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.176351 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 12:11:43 crc kubenswrapper[4834]: E1126 12:11:43.176813 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.148:6443: connect: connection refused" node="crc" Nov 26 12:11:43 crc kubenswrapper[4834]: W1126 12:11:43.321149 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:43 crc kubenswrapper[4834]: E1126 12:11:43.321244 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.148:6443: connect: connection refused" logger="UnhandledError" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.368703 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.420219 4834 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="99d6f06b314de361e81e7b6be89722a06fe4a2c0c3418d3a3b0c8254babe1baf" exitCode=0 Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.420307 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"99d6f06b314de361e81e7b6be89722a06fe4a2c0c3418d3a3b0c8254babe1baf"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.420658 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"09f182fbe1e74cd37a6ad4f24d6170e16b87f58bb132c25ab6d3702f8d0b7b9b"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.420771 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.421654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.421695 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.421745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.422097 4834 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427" exitCode=0 Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.422147 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.422192 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"deb216bf222c1e13b5d774b79bf7f24433bf83539454d60b7cca23769aa4381b"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.422349 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.423118 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.423145 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.423153 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.423711 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.423753 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5ff5fd71247956f360a97f7df2b6d34040adc8c0167ce8caeb82b83babad973"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.425388 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0" exitCode=0 Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.425449 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.425472 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9721011c541d6140848101bfdb75128117ed99c11f722a1191961c3c80059f1"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.425602 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.426455 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.426480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.426490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.427419 4834 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3" exitCode=0 Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.427442 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 
12:11:43.427475 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fd8afa951bc71a2613315a5111b1b469151821bfaf892ba3c48a2113b95b1bb7"} Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.427594 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.427896 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.428429 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.428473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.428486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.428823 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.428853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.428864 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:43 crc kubenswrapper[4834]: W1126 12:11:43.431333 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:43 crc kubenswrapper[4834]: E1126 
12:11:43.431476 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.148:6443: connect: connection refused" logger="UnhandledError" Nov 26 12:11:43 crc kubenswrapper[4834]: W1126 12:11:43.495445 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:43 crc kubenswrapper[4834]: E1126 12:11:43.495535 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.148:6443: connect: connection refused" logger="UnhandledError" Nov 26 12:11:43 crc kubenswrapper[4834]: W1126 12:11:43.774629 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.148:6443: connect: connection refused Nov 26 12:11:43 crc kubenswrapper[4834]: E1126 12:11:43.774717 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.148:6443: connect: connection refused" logger="UnhandledError" Nov 26 12:11:43 crc kubenswrapper[4834]: E1126 12:11:43.776411 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" interval="1.6s" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.977299 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.978707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.978741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.978751 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:43 crc kubenswrapper[4834]: I1126 12:11:43.978775 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 12:11:43 crc kubenswrapper[4834]: E1126 12:11:43.979161 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.148:6443: connect: connection refused" node="crc" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.431801 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"66866c33a678a8db1b8f2aad639dfb35a4db4c640e07262f340fb0d35025b486"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.431995 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.433098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 
12:11:44.433124 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.433133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.434106 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.434168 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.434182 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.434287 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.434976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.434998 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.435007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:44 crc kubenswrapper[4834]: 
I1126 12:11:44.436291 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.436337 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.436351 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.436391 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.437323 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.437348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.437358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.439989 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26"} Nov 26 12:11:44 crc kubenswrapper[4834]: 
I1126 12:11:44.440027 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.440042 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.440054 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.440065 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.440184 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.440903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.440933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.440991 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.442264 4834 generic.go:334] "Generic 
(PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b" exitCode=0 Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.442332 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b"} Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.442496 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.443206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.443229 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:44 crc kubenswrapper[4834]: I1126 12:11:44.443261 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.399347 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.446441 4834 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6" exitCode=0 Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.446871 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.446979 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.447252 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6"} Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.447434 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.447498 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.448022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.448051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.448061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.448623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.448671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.448685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.448641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.448757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.448768 4834 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.579626 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.580573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.580654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.580716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:45 crc kubenswrapper[4834]: I1126 12:11:45.580790 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.386179 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.453491 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4"} Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.453573 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6"} Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.453589 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093"} Nov 26 12:11:46 crc 
kubenswrapper[4834]: I1126 12:11:46.453593 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.453601 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c"} Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.453766 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.453760 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf"} Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.454895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.454924 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.454942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.454950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.454962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:46 crc kubenswrapper[4834]: I1126 12:11:46.454963 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.197062 
4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.197296 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.198396 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.198442 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.198453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.457808 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.458780 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.458816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.458826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.561245 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.561458 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.562785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.562822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:47 crc kubenswrapper[4834]: I1126 12:11:47.562833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:48 crc kubenswrapper[4834]: I1126 12:11:48.771550 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:48 crc kubenswrapper[4834]: I1126 12:11:48.771789 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:48 crc kubenswrapper[4834]: I1126 12:11:48.773340 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:48 crc kubenswrapper[4834]: I1126 12:11:48.773394 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:48 crc kubenswrapper[4834]: I1126 12:11:48.773406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:49 crc kubenswrapper[4834]: I1126 12:11:49.177490 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 26 12:11:49 crc kubenswrapper[4834]: I1126 12:11:49.177674 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:49 crc kubenswrapper[4834]: I1126 12:11:49.178905 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:49 crc kubenswrapper[4834]: I1126 12:11:49.178954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:49 crc kubenswrapper[4834]: I1126 12:11:49.178964 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:49 crc kubenswrapper[4834]: I1126 12:11:49.386963 4834 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 12:11:49 crc kubenswrapper[4834]: I1126 12:11:49.387042 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 12:11:51 crc kubenswrapper[4834]: I1126 12:11:51.551562 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:11:51 crc kubenswrapper[4834]: I1126 12:11:51.551754 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:51 crc kubenswrapper[4834]: I1126 12:11:51.552800 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:51 crc kubenswrapper[4834]: I1126 12:11:51.552838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:51 crc kubenswrapper[4834]: I1126 12:11:51.552849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:52 crc kubenswrapper[4834]: E1126 12:11:52.481783 4834 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 12:11:53 crc 
kubenswrapper[4834]: I1126 12:11:53.547735 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:53 crc kubenswrapper[4834]: I1126 12:11:53.547935 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:53 crc kubenswrapper[4834]: I1126 12:11:53.549033 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:53 crc kubenswrapper[4834]: I1126 12:11:53.549091 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:53 crc kubenswrapper[4834]: I1126 12:11:53.549104 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:53 crc kubenswrapper[4834]: I1126 12:11:53.847358 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:53 crc kubenswrapper[4834]: I1126 12:11:53.853000 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.369662 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.474178 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.474999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.475058 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.475071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.477896 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.807280 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.807366 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.810517 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.810553 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 26 
12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.869826 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.870016 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.871169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.871211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:54 crc kubenswrapper[4834]: I1126 12:11:54.871243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:55 crc kubenswrapper[4834]: I1126 12:11:55.482774 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:55 crc kubenswrapper[4834]: I1126 12:11:55.483752 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:55 crc kubenswrapper[4834]: I1126 12:11:55.483799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:55 crc kubenswrapper[4834]: I1126 12:11:55.483810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:56 crc kubenswrapper[4834]: I1126 12:11:56.485365 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:56 crc kubenswrapper[4834]: I1126 12:11:56.489084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:56 crc kubenswrapper[4834]: I1126 12:11:56.489124 4834 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:56 crc kubenswrapper[4834]: I1126 12:11:56.489138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:57 crc kubenswrapper[4834]: I1126 12:11:57.565619 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:57 crc kubenswrapper[4834]: I1126 12:11:57.565779 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:57 crc kubenswrapper[4834]: I1126 12:11:57.566426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:57 crc kubenswrapper[4834]: I1126 12:11:57.566459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:57 crc kubenswrapper[4834]: I1126 12:11:57.566470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:57 crc kubenswrapper[4834]: I1126 12:11:57.569537 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:11:58 crc kubenswrapper[4834]: I1126 12:11:58.490525 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 12:11:58 crc kubenswrapper[4834]: I1126 12:11:58.490587 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:11:58 crc kubenswrapper[4834]: I1126 12:11:58.491396 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:11:58 crc kubenswrapper[4834]: I1126 12:11:58.491427 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:11:58 crc kubenswrapper[4834]: I1126 12:11:58.491438 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.387186 4834 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.387266 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.795988 4834 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 26 12:11:59 crc kubenswrapper[4834]: E1126 12:11:59.800382 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.801844 4834 trace.go:236] Trace[408748409]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 12:11:45.813) (total time: 13988ms): Nov 26 12:11:59 crc kubenswrapper[4834]: Trace[408748409]: ---"Objects listed" error: 13988ms (12:11:59.801) Nov 26 12:11:59 crc kubenswrapper[4834]: Trace[408748409]: [13.988351752s] [13.988351752s] END Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.801899 4834 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.802533 4834 trace.go:236] Trace[474264679]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 12:11:45.437) (total time: 14364ms): Nov 26 12:11:59 crc kubenswrapper[4834]: Trace[474264679]: ---"Objects listed" error: 14364ms (12:11:59.802) Nov 26 12:11:59 crc kubenswrapper[4834]: Trace[474264679]: [14.364528336s] [14.364528336s] END Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.802566 4834 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 26 12:11:59 crc kubenswrapper[4834]: E1126 12:11:59.807179 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.807514 4834 trace.go:236] Trace[264450845]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 12:11:45.885) (total time: 13921ms): Nov 26 12:11:59 crc kubenswrapper[4834]: Trace[264450845]: ---"Objects listed" error: 13921ms (12:11:59.807) Nov 26 12:11:59 crc kubenswrapper[4834]: Trace[264450845]: [13.921837186s] [13.921837186s] END Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.807543 4834 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.808368 4834 trace.go:236] Trace[1432921224]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 12:11:45.903) (total time: 13904ms): Nov 26 12:11:59 crc kubenswrapper[4834]: Trace[1432921224]: ---"Objects listed" error: 13904ms (12:11:59.808) Nov 26 12:11:59 crc kubenswrapper[4834]: Trace[1432921224]: [13.904998559s] [13.904998559s] END Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.808396 4834 reflector.go:368] 
Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.824734 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40262->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.824804 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40262->192.168.126.11:17697: read: connection reset by peer" Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.824957 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40272->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.824982 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40272->192.168.126.11:17697: read: connection reset by peer" Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.825293 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 
192.168.126.11:17697: connect: connection refused" start-of-body= Nov 26 12:11:59 crc kubenswrapper[4834]: I1126 12:11:59.825365 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.372822 4834 apiserver.go:52] "Watching apiserver" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.375966 4834 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.376366 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.376861 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.376884 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.377079 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.376953 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.377193 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.377491 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.377563 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.377571 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.377883 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.378718 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.378758 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.379344 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.380060 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.381245 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.381783 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.385060 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.385097 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.386596 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.402738 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.413110 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.424528 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.435796 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.442540 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.449424 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.455743 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.474334 4834 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.496106 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.497787 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26" exitCode=255 Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.497826 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26"} Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498260 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498298 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498332 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") 
pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498353 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498371 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498385 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498401 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498417 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498434 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498452 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498482 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498498 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498515 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498532 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498547 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498563 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498578 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498594 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498611 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498608 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498625 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498641 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498698 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498714 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498732 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498747 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498761 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498775 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498804 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498819 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498834 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498840 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.498857 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499042 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499094 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499222 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499278 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499328 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499423 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499460 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499536 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499566 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499586 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499603 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499621 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499644 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499649 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499663 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499680 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499697 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499715 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499713 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499732 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499751 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499767 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499786 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499803 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499819 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499837 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499867 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499887 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499902 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499918 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499967 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499986 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500004 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.499989 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500018 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500035 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500052 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500067 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500082 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500097 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500114 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500130 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500166 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500182 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500203 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500220 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500237 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500251 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500267 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500285 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500301 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500345 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500363 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500380 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500397 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500413 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500429 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500446 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500464 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500480 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500498 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500513 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500528 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500545 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500562 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500578 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500594 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500612 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500628 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500642 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500659 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500674 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500689 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500704 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500737 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500753 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500769 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500786 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.500802 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501020 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501058 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501100 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501185 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501324 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501411 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501427 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501473 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501502 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.501518 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502121 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502155 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502191 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502231 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502333 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502460 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502499 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502517 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502632 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502882 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502907 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504206 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503230 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503237 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503277 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503327 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503340 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504389 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504407 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504444 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503460 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503561 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504477 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503646 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503665 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503694 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503756 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503459 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.503952 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504000 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504046 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504113 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504683 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.502940 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504721 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504734 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504836 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504872 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 12:12:00 crc 
kubenswrapper[4834]: I1126 12:12:00.504893 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504915 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504933 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504949 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504966 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504984 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.505000 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.505019 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.505036 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.505052 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.505069 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.505616 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.505670 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506171 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506229 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506253 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506581 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506670 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506697 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506719 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506739 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506757 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506773 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506789 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506805 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506822 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506839 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506870 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506890 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506906 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506923 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506939 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504788 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506957 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.504827 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506969 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.505208 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.505224 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506009 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506413 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506513 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506789 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506861 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.507098 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.507271 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.507468 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.507553 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.507590 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.507644 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.507646 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506941 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.506975 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.507887 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.507923 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508049 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508120 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508189 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508214 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508256 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508333 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508358 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508375 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508380 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508414 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508434 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508456 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508594 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508612 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508707 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508691 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508789 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508812 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 12:12:00 
crc kubenswrapper[4834]: I1126 12:12:00.508831 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508856 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508875 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508894 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508912 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508930 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508948 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508967 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508988 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509006 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.508991 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509028 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509049 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509065 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509082 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509101 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509113 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509121 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509138 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509064 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509169 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509190 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). 
InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509195 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509257 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509392 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509414 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509431 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509507 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509527 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509545 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509592 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509612 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509628 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509651 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509668 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509686 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509688 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509702 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509743 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509903 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509914 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509931 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509930 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509950 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.510229 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.510602 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.510783 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511013 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511076 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511102 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.510978 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511296 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511432 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.509767 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511613 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511636 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511623 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511664 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511694 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511693 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511723 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511733 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511746 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511767 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511769 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511789 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511810 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511865 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511893 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511907 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511951 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511986 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512006 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512032 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512140 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512188 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512202 4834 scope.go:117] "RemoveContainer" containerID="4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512232 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512260 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512285 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512326 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512351 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512374 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512427 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512448 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512476 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512543 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512572 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512598 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" 
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512899 4834 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512917 4834 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512928 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512941 4834 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512956 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512980 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512990 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513004 4834 reconciler_common.go:293] 
"Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513013 4834 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513023 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513033 4834 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513117 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513126 4834 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513135 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513147 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513160 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513170 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513179 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513190 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513202 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513213 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513225 4834 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node 
\"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513240 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513249 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513259 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513268 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513281 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513290 4834 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513299 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513324 4834 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513340 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513350 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513359 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513374 4834 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513383 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513394 4834 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513402 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513416 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513429 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513441 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513450 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513462 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513471 4834 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513500 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513509 4834 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513522 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513531 4834 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513542 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513556 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513565 4834 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513575 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513584 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513596 4834 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513606 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513618 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513630 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513642 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513652 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513661 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513671 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513684 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513694 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513703 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513718 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513726 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath 
\"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513738 4834 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513748 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513762 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513771 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513781 4834 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513791 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513804 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513813 4834 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513823 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513836 4834 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513905 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513915 4834 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513924 4834 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513936 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513946 4834 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513956 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513966 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513978 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513987 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513996 4834 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514004 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514017 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514026 
4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514036 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514048 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514057 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514066 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514075 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514087 4834 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514097 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514109 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514119 4834 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514131 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514140 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514152 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514163 4834 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514174 4834 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514183 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514192 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514205 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514214 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514223 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514233 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511949 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") 
pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512012 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512063 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512118 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512368 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512448 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512720 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.512934 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.519328 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:12:01.019276717 +0000 UTC m=+18.926490069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.519382 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.519463 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.519059 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513421 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513479 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513643 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.513833 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514081 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514295 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.519677 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.514795 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.515006 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.515095 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.515346 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.515379 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.510932 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516125 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516297 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516366 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516495 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516568 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.511382 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516667 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516772 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516783 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516786 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516832 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.516832 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.517878 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.517887 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.517890 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.518206 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.518231 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.518469 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.518554 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.518577 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.518144 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.519950 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.520016 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:01.019994378 +0000 UTC m=+18.927207730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.518866 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.518947 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.519168 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.519811 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.520227 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:01.020214819 +0000 UTC m=+18.927428172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.520294 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.520371 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.520373 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.520547 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.521061 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.521550 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.520032 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.527055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.527068 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.527364 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.527573 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.527652 4834 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.527681 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.527722 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.527764 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.527994 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.528015 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.528053 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:01.028042592 +0000 UTC m=+18.935255944 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.528187 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.528505 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.528545 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.529054 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.529081 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.529095 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:00 crc kubenswrapper[4834]: E1126 12:12:00.529143 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:01.029127478 +0000 UTC m=+18.936340830 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.529643 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.529736 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.530176 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.530178 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.530416 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.530448 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.530465 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.530576 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.533212 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.533254 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.535705 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.536109 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.537684 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.537887 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.538791 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.539225 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.539254 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.539462 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.540057 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.540245 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.540539 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.540681 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.540752 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.542365 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.542862 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.543183 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.548198 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.549330 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.555721 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.558582 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.561235 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.566489 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.573733 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.579444 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615016 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615062 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615120 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615134 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615147 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615158 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615168 4834 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615179 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615194 4834 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615208 4834 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615196 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615219 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615296 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615295 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615335 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615351 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615365 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615376 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615385 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615394 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615404 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615415 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615430 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615442 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615456 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615468 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615480 4834 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615492 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615502 4834 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615512 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615522 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615533 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615544 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615554 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615564 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615573 4834 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615581 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615590 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 
12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615600 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615609 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615619 4834 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615629 4834 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615638 4834 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615648 4834 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615657 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615665 4834 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615675 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615684 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615694 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615737 4834 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615747 4834 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615757 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615766 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615776 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615785 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615794 4834 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615804 4834 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615815 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615825 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615839 4834 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615862 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615870 4834 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615879 4834 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615887 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615898 4834 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615906 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615914 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" 
Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615923 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615932 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615941 4834 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615950 4834 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615959 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615967 4834 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615977 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615987 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.615995 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616004 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616012 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616021 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616029 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616038 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616046 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616054 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616063 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616072 4834 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616082 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616092 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616101 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616109 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.616118 4834 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.692664 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.698694 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 12:12:00 crc kubenswrapper[4834]: I1126 12:12:00.704381 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 12:12:00 crc kubenswrapper[4834]: W1126 12:12:00.714521 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d5904c2f63c3b16f2c0e09383b7fdfb1de96d29620cafb0ad41d48288c85e329 WatchSource:0}: Error finding container d5904c2f63c3b16f2c0e09383b7fdfb1de96d29620cafb0ad41d48288c85e329: Status 404 returned error can't find the container with id d5904c2f63c3b16f2c0e09383b7fdfb1de96d29620cafb0ad41d48288c85e329 Nov 26 12:12:00 crc kubenswrapper[4834]: W1126 12:12:00.720373 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8c1682a91ac64f85c02e432c235b8b09bc3890cd2ab21e73d985e8aec4eb3b8d WatchSource:0}: Error finding container 8c1682a91ac64f85c02e432c235b8b09bc3890cd2ab21e73d985e8aec4eb3b8d: Status 404 returned error can't find the container 
with id 8c1682a91ac64f85c02e432c235b8b09bc3890cd2ab21e73d985e8aec4eb3b8d Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.019663 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.019874 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:12:02.019824154 +0000 UTC m=+19.927037516 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.120974 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.121019 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.121049 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.121070 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121172 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121241 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121263 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121280 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for 
pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121290 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:02.121266169 +0000 UTC m=+20.028479522 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121355 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:02.121340528 +0000 UTC m=+20.028553881 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121358 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121439 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:02.121427631 +0000 UTC m=+20.028640993 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121242 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121516 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121528 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.121585 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:02.121576108 +0000 UTC m=+20.028789471 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.416944 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:01 crc kubenswrapper[4834]: E1126 12:12:01.417072 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.502346 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be"} Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.502396 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18"} Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.502406 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8c1682a91ac64f85c02e432c235b8b09bc3890cd2ab21e73d985e8aec4eb3b8d"} Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.504091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d5904c2f63c3b16f2c0e09383b7fdfb1de96d29620cafb0ad41d48288c85e329"} Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.506702 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504"} Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.506731 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"06220f9a8e6b1f4b51ba9b98081fe29a21106e30f61a1f37dd8605a4bcb2f133"} Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.508141 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.509467 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72"} Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.509608 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.518388 4834 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",
\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 
1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.528333 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.537455 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.547636 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.557599 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.568479 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.578325 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.590083 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.602451 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.611586 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.622266 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.639105 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.648082 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:01 crc kubenswrapper[4834]: I1126 12:12:01.659089 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:01Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.028009 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.028231 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:12:04.028199014 +0000 UTC m=+21.935412376 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.128673 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.128716 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.128739 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.128758 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.128849 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.128889 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.128910 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.128922 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.128936 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:04.128915765 +0000 UTC m=+22.036129117 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.128974 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:04.128959828 +0000 UTC m=+22.036173180 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.129017 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.129047 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:04.129041159 +0000 UTC m=+22.036254511 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.129087 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.129096 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.129103 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.129122 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:04.129114577 +0000 UTC m=+22.036327919 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.416693 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.416704 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.416873 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:02 crc kubenswrapper[4834]: E1126 12:12:02.416984 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.420410 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.420967 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.422138 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.422741 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.423638 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.424131 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.424662 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.425543 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.426110 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.426953 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.427419 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.428450 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.428907 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.429385 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.429678 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.430210 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.430708 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.431555 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.431921 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.432436 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.433347 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.433753 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.434641 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.435056 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.435989 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.436463 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.437026 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.437997 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.438433 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.439265 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.439733 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.440558 4834 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.440655 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.440939 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.442151 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.443091 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 
12:12:02.443495 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.444835 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.445403 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.446164 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.446903 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.448245 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.449042 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.449695 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 
12:12:02.450022 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.450404 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.451020 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.451512 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.452084 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.452669 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.453396 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.453846 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.454287 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.454767 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.455278 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.455857 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.456420 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.460085 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.469003 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.479251 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:02 crc kubenswrapper[4834]: I1126 12:12:02.489777 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.008249 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.009604 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.009644 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.009654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.009721 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.015354 4834 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.015583 4834 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.016505 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc 
kubenswrapper[4834]: I1126 12:12:03.016556 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.016569 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.016589 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.016602 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: E1126 12:12:03.030143 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.033636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.033662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.033671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.033687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.033698 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: E1126 12:12:03.043431 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.045629 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.045662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.045674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.045692 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.045702 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: E1126 12:12:03.053723 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.055997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.056020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.056030 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.056040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.056048 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: E1126 12:12:03.064227 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.066966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.066993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.067002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.067013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.067035 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: E1126 12:12:03.075038 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: E1126 12:12:03.075147 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.076480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.076521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.076532 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.076541 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.076548 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.178337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.178466 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.178482 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.178502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.178513 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.280516 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.280571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.280585 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.280607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.280618 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.383262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.383325 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.383341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.383366 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.383377 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.416783 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:03 crc kubenswrapper[4834]: E1126 12:12:03.416928 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.485672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.485726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.485737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.485758 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.485768 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.515180 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.526072 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.535999 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.547025 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.557131 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.566774 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.576352 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.585808 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:03Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.587482 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.587520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.587532 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.587549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.587559 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.689368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.689403 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.689413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.689425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.689435 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.792726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.792781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.792803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.792822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.792836 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.895633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.895678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.895689 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.895706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.895718 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.998246 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.998292 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.998303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.998336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:03 crc kubenswrapper[4834]: I1126 12:12:03.998353 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:03Z","lastTransitionTime":"2025-11-26T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.045776 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.045924 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 12:12:08.045889853 +0000 UTC m=+25.953103205 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.081069 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5gvrf"] Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.081481 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4rwmt"] Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.081595 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5gvrf" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.082004 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.084163 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.084255 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.084390 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.085231 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.085268 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.085476 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.086329 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.100264 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.100299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.100331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.100348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.100358 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:04Z","lastTransitionTime":"2025-11-26T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.101107 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.118983 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.136391 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.147193 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.147235 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx8mg\" (UniqueName: \"kubernetes.io/projected/b7c651b3-b6f5-4af8-9cc7-4728f137227a-kube-api-access-gx8mg\") pod \"node-ca-4rwmt\" (UID: \"b7c651b3-b6f5-4af8-9cc7-4728f137227a\") " pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.147259 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:04 crc 
kubenswrapper[4834]: I1126 12:12:04.147275 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7c651b3-b6f5-4af8-9cc7-4728f137227a-host\") pod \"node-ca-4rwmt\" (UID: \"b7c651b3-b6f5-4af8-9cc7-4728f137227a\") " pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.147292 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5a5b7690-00d4-4ca0-8d22-b236f5d25580-hosts-file\") pod \"node-resolver-5gvrf\" (UID: \"5a5b7690-00d4-4ca0-8d22-b236f5d25580\") " pod="openshift-dns/node-resolver-5gvrf" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.147322 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc2sj\" (UniqueName: \"kubernetes.io/projected/5a5b7690-00d4-4ca0-8d22-b236f5d25580-kube-api-access-dc2sj\") pod \"node-resolver-5gvrf\" (UID: \"5a5b7690-00d4-4ca0-8d22-b236f5d25580\") " pod="openshift-dns/node-resolver-5gvrf" Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147406 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147439 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147443 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147457 4834 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.147468 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147507 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:08.147490815 +0000 UTC m=+26.054704168 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147527 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147458 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147548 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147555 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:08.147547161 +0000 UTC m=+26.054760513 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.147525 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147570 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.147580 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b7c651b3-b6f5-4af8-9cc7-4728f137227a-serviceca\") pod \"node-ca-4rwmt\" (UID: \"b7c651b3-b6f5-4af8-9cc7-4728f137227a\") " pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147593 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:08.147587747 +0000 UTC m=+26.054801099 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.147626 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:08.147618614 +0000 UTC m=+26.054831966 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.159822 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.161925 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-k8hjt"] Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.162208 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: W1126 12:12:04.164969 4834 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.165001 4834 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.165491 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.165664 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.165887 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.166606 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.179376 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.189033 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.196271 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.202485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.202514 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.202524 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.202538 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.202546 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:04Z","lastTransitionTime":"2025-11-26T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.206141 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.216296 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.228604 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.239645 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.248909 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc2sj\" (UniqueName: \"kubernetes.io/projected/5a5b7690-00d4-4ca0-8d22-b236f5d25580-kube-api-access-dc2sj\") pod \"node-resolver-5gvrf\" (UID: \"5a5b7690-00d4-4ca0-8d22-b236f5d25580\") " pod="openshift-dns/node-resolver-5gvrf" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.248969 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlxwn\" (UniqueName: \"kubernetes.io/projected/234b786b-76dd-4238-81bd-a743042bece9-kube-api-access-vlxwn\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.248999 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5a5b7690-00d4-4ca0-8d22-b236f5d25580-hosts-file\") pod \"node-resolver-5gvrf\" (UID: \"5a5b7690-00d4-4ca0-8d22-b236f5d25580\") " pod="openshift-dns/node-resolver-5gvrf" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249020 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-system-cni-dir\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249039 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-cnibin\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249112 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-os-release\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249295 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5a5b7690-00d4-4ca0-8d22-b236f5d25580-hosts-file\") pod \"node-resolver-5gvrf\" (UID: \"5a5b7690-00d4-4ca0-8d22-b236f5d25580\") " pod="openshift-dns/node-resolver-5gvrf" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249295 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-multus-conf-dir\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249405 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-etc-kubernetes\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249434 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-multus-socket-dir-parent\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249456 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-hostroot\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249495 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b7c651b3-b6f5-4af8-9cc7-4728f137227a-serviceca\") pod \"node-ca-4rwmt\" (UID: \"b7c651b3-b6f5-4af8-9cc7-4728f137227a\") " pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249516 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-var-lib-cni-multus\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249537 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-run-multus-certs\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249558 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-run-netns\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249580 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-var-lib-cni-bin\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249599 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/234b786b-76dd-4238-81bd-a743042bece9-cni-binary-copy\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249664 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-var-lib-kubelet\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249739 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx8mg\" (UniqueName: 
\"kubernetes.io/projected/b7c651b3-b6f5-4af8-9cc7-4728f137227a-kube-api-access-gx8mg\") pod \"node-ca-4rwmt\" (UID: \"b7c651b3-b6f5-4af8-9cc7-4728f137227a\") " pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249768 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-run-k8s-cni-cncf-io\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249829 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/234b786b-76dd-4238-81bd-a743042bece9-multus-daemon-config\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.249953 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7c651b3-b6f5-4af8-9cc7-4728f137227a-host\") pod \"node-ca-4rwmt\" (UID: \"b7c651b3-b6f5-4af8-9cc7-4728f137227a\") " pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.250005 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-multus-cni-dir\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.250147 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7c651b3-b6f5-4af8-9cc7-4728f137227a-host\") pod \"node-ca-4rwmt\" (UID: 
\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\") " pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.251414 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b7c651b3-b6f5-4af8-9cc7-4728f137227a-serviceca\") pod \"node-ca-4rwmt\" (UID: \"b7c651b3-b6f5-4af8-9cc7-4728f137227a\") " pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.252714 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.270098 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx8mg\" (UniqueName: \"kubernetes.io/projected/b7c651b3-b6f5-4af8-9cc7-4728f137227a-kube-api-access-gx8mg\") pod \"node-ca-4rwmt\" (UID: \"b7c651b3-b6f5-4af8-9cc7-4728f137227a\") " pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.270377 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc2sj\" (UniqueName: \"kubernetes.io/projected/5a5b7690-00d4-4ca0-8d22-b236f5d25580-kube-api-access-dc2sj\") pod \"node-resolver-5gvrf\" (UID: \"5a5b7690-00d4-4ca0-8d22-b236f5d25580\") " pod="openshift-dns/node-resolver-5gvrf" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.272138 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.285619 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.296015 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.304742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.304786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.304799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.304815 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.304826 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:04Z","lastTransitionTime":"2025-11-26T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.305999 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.317884 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.325561 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.350888 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-run-k8s-cni-cncf-io\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.350925 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/234b786b-76dd-4238-81bd-a743042bece9-multus-daemon-config\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.350967 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-multus-cni-dir\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.350985 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlxwn\" (UniqueName: \"kubernetes.io/projected/234b786b-76dd-4238-81bd-a743042bece9-kube-api-access-vlxwn\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351003 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-system-cni-dir\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351017 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-multus-conf-dir\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351035 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-etc-kubernetes\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " 
pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351036 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-run-k8s-cni-cncf-io\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351052 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-cnibin\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351098 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-cnibin\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351100 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-multus-conf-dir\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351112 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-system-cni-dir\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351125 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-os-release\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351175 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-os-release\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351192 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-multus-cni-dir\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351209 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-etc-kubernetes\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351253 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-multus-socket-dir-parent\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351223 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-multus-socket-dir-parent\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " 
pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351323 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-hostroot\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351348 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-run-netns\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351371 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-var-lib-cni-bin\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351390 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-var-lib-cni-multus\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351406 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-run-netns\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351410 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-run-multus-certs\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351432 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-var-lib-cni-multus\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351438 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-var-lib-cni-bin\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351433 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-run-multus-certs\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351445 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/234b786b-76dd-4238-81bd-a743042bece9-cni-binary-copy\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351410 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-hostroot\") pod \"multus-k8hjt\" (UID: 
\"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351483 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-var-lib-kubelet\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351526 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/234b786b-76dd-4238-81bd-a743042bece9-host-var-lib-kubelet\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.351582 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/234b786b-76dd-4238-81bd-a743042bece9-multus-daemon-config\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.377743 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlxwn\" (UniqueName: \"kubernetes.io/projected/234b786b-76dd-4238-81bd-a743042bece9-kube-api-access-vlxwn\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.391928 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5gvrf" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.397559 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4rwmt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.408086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.408119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.408133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.408147 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.408158 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:04Z","lastTransitionTime":"2025-11-26T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: W1126 12:12:04.409069 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7c651b3_b6f5_4af8_9cc7_4728f137227a.slice/crio-d36d30c3b335cbc136238f73b4d9715f3467222edeed45ffb9b7806337b60514 WatchSource:0}: Error finding container d36d30c3b335cbc136238f73b4d9715f3467222edeed45ffb9b7806337b60514: Status 404 returned error can't find the container with id d36d30c3b335cbc136238f73b4d9715f3467222edeed45ffb9b7806337b60514 Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.416860 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.416910 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.416963 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:04 crc kubenswrapper[4834]: E1126 12:12:04.417024 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.511161 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.511195 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.511204 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.511251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.511261 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:04Z","lastTransitionTime":"2025-11-26T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.521995 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5gvrf" event={"ID":"5a5b7690-00d4-4ca0-8d22-b236f5d25580","Type":"ContainerStarted","Data":"281d64c4335ff4ec480c571249969a6b87fd9f0fe5756bbf9ffa0f979d83a425"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.525437 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4rwmt" event={"ID":"b7c651b3-b6f5-4af8-9cc7-4728f137227a","Type":"ContainerStarted","Data":"d36d30c3b335cbc136238f73b4d9715f3467222edeed45ffb9b7806337b60514"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.536133 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-q5s5d"] Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.536753 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.538484 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.540814 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.553016 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.567817 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.577896 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.587533 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.597370 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.607092 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.613945 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.613985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.613997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.614014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.614024 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:04Z","lastTransitionTime":"2025-11-26T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.621441 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.629928 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.640858 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.650010 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.654174 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.654260 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-system-cni-dir\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.654284 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7910fe0-c205-465a-b8b5-9b56d8bb1941-cni-binary-copy\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.654393 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c7910fe0-c205-465a-b8b5-9b56d8bb1941-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.654526 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-os-release\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.654565 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-cnibin\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc 
kubenswrapper[4834]: I1126 12:12:04.654593 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrbq\" (UniqueName: \"kubernetes.io/projected/c7910fe0-c205-465a-b8b5-9b56d8bb1941-kube-api-access-xwrbq\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.661603 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host
-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.716296 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.716347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.716376 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.716393 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.716403 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:04Z","lastTransitionTime":"2025-11-26T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755384 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-system-cni-dir\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755424 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7910fe0-c205-465a-b8b5-9b56d8bb1941-cni-binary-copy\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755444 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c7910fe0-c205-465a-b8b5-9b56d8bb1941-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755472 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-os-release\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755488 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-cnibin\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: 
\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755506 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrbq\" (UniqueName: \"kubernetes.io/projected/c7910fe0-c205-465a-b8b5-9b56d8bb1941-kube-api-access-xwrbq\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755536 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755543 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-system-cni-dir\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755609 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-cnibin\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.755612 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-os-release\") pod 
\"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.756072 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7910fe0-c205-465a-b8b5-9b56d8bb1941-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.756137 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c7910fe0-c205-465a-b8b5-9b56d8bb1941-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.769522 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrbq\" (UniqueName: \"kubernetes.io/projected/c7910fe0-c205-465a-b8b5-9b56d8bb1941-kube-api-access-xwrbq\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.819282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.819347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.819361 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.819379 4834 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.819392 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:04Z","lastTransitionTime":"2025-11-26T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.902570 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.905931 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9dvt4"] Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.906688 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.906819 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xzb52"] Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.907266 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.909182 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.909423 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.909561 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.909671 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.909749 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.909780 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.910137 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.910422 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.910468 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.910978 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 26 12:12:04 crc 
kubenswrapper[4834]: I1126 12:12:04.911523 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.911560 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.921258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.921301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.921327 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.921346 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.921357 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:04Z","lastTransitionTime":"2025-11-26T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.921785 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.925508 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.945891 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.960884 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.974630 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.990242 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:04 crc kubenswrapper[4834]: I1126 12:12:04.998360 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.008998 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.019430 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.023793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.023819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.023830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.023844 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.023855 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.028988 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.039960 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.050591 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.057267 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-openvswitch\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc 
kubenswrapper[4834]: I1126 12:12:05.057485 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.057598 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-config\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.057680 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-env-overrides\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.057750 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-ovn\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.057838 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-bin\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 
12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.057899 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-netd\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.057994 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-netns\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058081 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058153 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v8l4\" (UniqueName: \"kubernetes.io/projected/b15e8745-fc1a-4575-ac07-e483f8e41c8d-kube-api-access-6v8l4\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058218 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-kubelet\") pod \"ovnkube-node-9dvt4\" 
(UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058288 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-etc-openvswitch\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058392 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plkpv\" (UniqueName: \"kubernetes.io/projected/e7f44620-97b4-4cdb-8252-d8a2971830fa-kube-api-access-plkpv\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058474 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-slash\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058571 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-node-log\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058645 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-systemd-units\") pod 
\"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058716 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-systemd\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058789 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-script-lib\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058870 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-var-lib-openvswitch\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058930 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-log-socket\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.058993 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovn-node-metrics-cert\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.059060 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b15e8745-fc1a-4575-ac07-e483f8e41c8d-rootfs\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.059119 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b15e8745-fc1a-4575-ac07-e483f8e41c8d-proxy-tls\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.059239 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b15e8745-fc1a-4575-ac07-e483f8e41c8d-mcd-auth-proxy-config\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.059393 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.068103 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.076725 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.085201 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.091952 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.100348 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.111470 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.125900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.125937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.125949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.125964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.125975 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.126810 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.140509 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf796318
9214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.150141 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.158888 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160215 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b15e8745-fc1a-4575-ac07-e483f8e41c8d-mcd-auth-proxy-config\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160252 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-openvswitch\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160341 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160421 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-config\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160443 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160420 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-openvswitch\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160451 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-env-overrides\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160562 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-ovn\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160595 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-bin\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160617 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-netd\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160637 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v8l4\" (UniqueName: \"kubernetes.io/projected/b15e8745-fc1a-4575-ac07-e483f8e41c8d-kube-api-access-6v8l4\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160672 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-netns\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160689 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160714 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-kubelet\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160733 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-etc-openvswitch\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160753 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-netns\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160780 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plkpv\" (UniqueName: \"kubernetes.io/projected/e7f44620-97b4-4cdb-8252-d8a2971830fa-kube-api-access-plkpv\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160839 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-slash\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160877 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-ovn\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160907 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-node-log\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160955 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-slash\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160821 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-netd\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.160882 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-node-log\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161005 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-bin\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161038 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161045 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-kubelet\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161050 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-systemd-units\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161076 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-etc-openvswitch\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 
crc kubenswrapper[4834]: I1126 12:12:05.161084 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-systemd\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161108 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-script-lib\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161080 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b15e8745-fc1a-4575-ac07-e483f8e41c8d-mcd-auth-proxy-config\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161130 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b15e8745-fc1a-4575-ac07-e483f8e41c8d-rootfs\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161127 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-systemd\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161110 
4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-systemd-units\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161180 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-var-lib-openvswitch\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161151 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-var-lib-openvswitch\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161221 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b15e8745-fc1a-4575-ac07-e483f8e41c8d-rootfs\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161236 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-config\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161249 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-log-socket\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161280 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovn-node-metrics-cert\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161305 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b15e8745-fc1a-4575-ac07-e483f8e41c8d-proxy-tls\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.161303 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-log-socket\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.162035 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-script-lib\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.162334 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-env-overrides\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.164160 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovn-node-metrics-cert\") pod \"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.164249 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b15e8745-fc1a-4575-ac07-e483f8e41c8d-proxy-tls\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.169469 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.174442 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plkpv\" (UniqueName: \"kubernetes.io/projected/e7f44620-97b4-4cdb-8252-d8a2971830fa-kube-api-access-plkpv\") pod 
\"ovnkube-node-9dvt4\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.178099 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v8l4\" (UniqueName: \"kubernetes.io/projected/b15e8745-fc1a-4575-ac07-e483f8e41c8d-kube-api-access-6v8l4\") pod \"machine-config-daemon-xzb52\" (UID: \"b15e8745-fc1a-4575-ac07-e483f8e41c8d\") " pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.181161 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.191120 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.202258 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.221281 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.226483 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.228232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.228264 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.228276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.228292 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.228330 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: W1126 12:12:05.240871 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb15e8745_fc1a_4575_ac07_e483f8e41c8d.slice/crio-d20d9bf313ac1a331bbbd4cb931131e9443bf894aef28ac14c6c69f4e0594aaa WatchSource:0}: Error finding container d20d9bf313ac1a331bbbd4cb931131e9443bf894aef28ac14c6c69f4e0594aaa: Status 404 returned error can't find the container with id d20d9bf313ac1a331bbbd4cb931131e9443bf894aef28ac14c6c69f4e0594aaa Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.330420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.330470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.330486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.330512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.330530 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: E1126 12:12:05.352368 4834 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Nov 26 12:12:05 crc kubenswrapper[4834]: E1126 12:12:05.352454 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/234b786b-76dd-4238-81bd-a743042bece9-cni-binary-copy podName:234b786b-76dd-4238-81bd-a743042bece9 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:05.852434823 +0000 UTC m=+23.759648175 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/234b786b-76dd-4238-81bd-a743042bece9-cni-binary-copy") pod "multus-k8hjt" (UID: "234b786b-76dd-4238-81bd-a743042bece9") : failed to sync configmap cache: timed out waiting for the condition Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.417028 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:05 crc kubenswrapper[4834]: E1126 12:12:05.417187 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.433276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.433659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.433672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.433697 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.433712 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.438389 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.447032 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7910fe0-c205-465a-b8b5-9b56d8bb1941-cni-binary-copy\") pod \"multus-additional-cni-plugins-q5s5d\" (UID: \"c7910fe0-c205-465a-b8b5-9b56d8bb1941\") " pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.456182 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" Nov 26 12:12:05 crc kubenswrapper[4834]: W1126 12:12:05.472981 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7910fe0_c205_465a_b8b5_9b56d8bb1941.slice/crio-139a94e9be47454bbbc3727ad42cfcc801ee022252476ac172aa0f9956af7f07 WatchSource:0}: Error finding container 139a94e9be47454bbbc3727ad42cfcc801ee022252476ac172aa0f9956af7f07: Status 404 returned error can't find the container with id 139a94e9be47454bbbc3727ad42cfcc801ee022252476ac172aa0f9956af7f07 Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.529962 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859" exitCode=0 Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.530050 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.530120 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"c47fc0dcaffee3586df9083cc357f917edd800c7f47f55a69389b47b37ffef17"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.532408 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5gvrf" event={"ID":"5a5b7690-00d4-4ca0-8d22-b236f5d25580","Type":"ContainerStarted","Data":"8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.534956 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-q5s5d" event={"ID":"c7910fe0-c205-465a-b8b5-9b56d8bb1941","Type":"ContainerStarted","Data":"139a94e9be47454bbbc3727ad42cfcc801ee022252476ac172aa0f9956af7f07"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.536637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.536687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.536704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.536729 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.536741 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.538618 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4rwmt" event={"ID":"b7c651b3-b6f5-4af8-9cc7-4728f137227a","Type":"ContainerStarted","Data":"dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.541916 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.541968 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.541985 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"d20d9bf313ac1a331bbbd4cb931131e9443bf894aef28ac14c6c69f4e0594aaa"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.550875 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.563493 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.577815 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.592914 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.601834 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.615003 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.625241 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.635564 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.639672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.639707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.639716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.639735 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.639746 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.647918 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.658399 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.669141 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.688193 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.705507 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.719228 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.735960 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.742814 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.742859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.742872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.742896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.742911 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.746881 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.762168 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.778370 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.792474 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.803352 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.814897 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.824274 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.845585 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.845638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.845648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.845670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.845683 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.851106 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.869047 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/234b786b-76dd-4238-81bd-a743042bece9-cni-binary-copy\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.870133 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/234b786b-76dd-4238-81bd-a743042bece9-cni-binary-copy\") pod \"multus-k8hjt\" (UID: \"234b786b-76dd-4238-81bd-a743042bece9\") " pod="openshift-multus/multus-k8hjt" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.891292 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.931292 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.949267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.949325 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.949336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.949359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.949372 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:05Z","lastTransitionTime":"2025-11-26T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.972906 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-k8hjt" Nov 26 12:12:05 crc kubenswrapper[4834]: W1126 12:12:05.988255 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234b786b_76dd_4238_81bd_a743042bece9.slice/crio-1149eeed52db1fba102ca672f6413b77ed81c40b9e92458aed78f1f5d372bbee WatchSource:0}: Error finding container 1149eeed52db1fba102ca672f6413b77ed81c40b9e92458aed78f1f5d372bbee: Status 404 returned error can't find the container with id 1149eeed52db1fba102ca672f6413b77ed81c40b9e92458aed78f1f5d372bbee Nov 26 12:12:05 crc kubenswrapper[4834]: I1126 12:12:05.990009 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:05Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.044759 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.051803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.051840 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.051850 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.051870 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.051880 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.053622 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.155093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.155129 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.155138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.155154 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.155167 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.257829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.257886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.257901 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.257922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.257935 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.360338 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.360588 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.360599 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.360616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.360627 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.390047 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.393202 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.403457 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.412742 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\
":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.416859 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:06 crc kubenswrapper[4834]: E1126 12:12:06.416958 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.417380 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:06 crc kubenswrapper[4834]: E1126 12:12:06.417495 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.422978 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.436636 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.449770 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.458704 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.462290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.462360 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.462370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.462385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.462395 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.467638 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.478563 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 
12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.488435 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.498085 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.506981 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.519730 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.547613 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7910fe0-c205-465a-b8b5-9b56d8bb1941" containerID="880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca" exitCode=0 Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.547706 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" event={"ID":"c7910fe0-c205-465a-b8b5-9b56d8bb1941","Type":"ContainerDied","Data":"880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.550078 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k8hjt" event={"ID":"234b786b-76dd-4238-81bd-a743042bece9","Type":"ContainerStarted","Data":"a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.550121 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k8hjt" 
event={"ID":"234b786b-76dd-4238-81bd-a743042bece9","Type":"ContainerStarted","Data":"1149eeed52db1fba102ca672f6413b77ed81c40b9e92458aed78f1f5d372bbee"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.552133 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.558632 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.558670 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.558686 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.558697 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.558708 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.558723 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.565485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.565519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.565530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.565547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.565562 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.593176 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.629766 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.670153 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.670197 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.670211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.670231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.670245 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.671181 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.708430 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.750488 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.773543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.773584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.773594 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.773613 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.773925 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.788059 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.830099 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 
12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.873056 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.878124 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.878239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.878300 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.878397 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.878469 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.911377 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.950613 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.981122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.981226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.981295 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.981393 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.981495 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:06Z","lastTransitionTime":"2025-11-26T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:06 crc kubenswrapper[4834]: I1126 12:12:06.989562 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325
058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:06Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.031100 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.070501 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.084425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.084456 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.084468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.084486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.084502 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:07Z","lastTransitionTime":"2025-11-26T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.115556 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.156007 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.186886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.186941 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.186956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.186979 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.186992 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:07Z","lastTransitionTime":"2025-11-26T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.189525 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.231278 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.289542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.289582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:07 crc 
kubenswrapper[4834]: I1126 12:12:07.289607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.289627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.289638 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:07Z","lastTransitionTime":"2025-11-26T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.393152 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.393219 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.393238 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.393264 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.393275 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:07Z","lastTransitionTime":"2025-11-26T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.416556 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:07 crc kubenswrapper[4834]: E1126 12:12:07.416726 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.495962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.496017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.496029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.496053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.496067 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:07Z","lastTransitionTime":"2025-11-26T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.565417 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7910fe0-c205-465a-b8b5-9b56d8bb1941" containerID="76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f" exitCode=0 Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.565494 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" event={"ID":"c7910fe0-c205-465a-b8b5-9b56d8bb1941","Type":"ContainerDied","Data":"76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f"} Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.584886 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.596398 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.598600 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.598636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.598648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.598666 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.598678 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:07Z","lastTransitionTime":"2025-11-26T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.610244 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.625770 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.636435 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.646685 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.656900 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.667183 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.675656 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.689651 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.699995 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.701390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.701431 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.701443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.701464 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.701475 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:07Z","lastTransitionTime":"2025-11-26T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.711456 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.752241 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.789777 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.803773 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.803823 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.803834 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.803856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.803869 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:07Z","lastTransitionTime":"2025-11-26T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.829775 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325
058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:07Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.906237 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.906292 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.906303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 
12:12:07.906346 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:07 crc kubenswrapper[4834]: I1126 12:12:07.906361 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:07Z","lastTransitionTime":"2025-11-26T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.008867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.008903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.008911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.008927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.008936 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.090830 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.091052 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:12:16.091021742 +0000 UTC m=+33.998235094 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.114432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.114515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.114533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.114560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: 
I1126 12:12:08.114579 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.191735 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.191790 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.191813 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.191842 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.191928 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.191968 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.191976 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.192040 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.192051 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.191996 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.192063 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.192074 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.191979 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:16.191963525 +0000 UTC m=+34.099176877 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.192127 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:16.192108616 +0000 UTC m=+34.099321968 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.192148 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:16.192141537 +0000 UTC m=+34.099354889 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.192164 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:16.192160072 +0000 UTC m=+34.099373423 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.216727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.216766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.216777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.216793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.216802 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.319374 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.319616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.319691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.319781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.319838 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.416892 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.416944 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.417412 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:08 crc kubenswrapper[4834]: E1126 12:12:08.417588 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.422147 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.422199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.422210 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.422230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.422245 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.524096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.524125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.524136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.524151 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.524162 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.570618 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7910fe0-c205-465a-b8b5-9b56d8bb1941" containerID="5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3" exitCode=0 Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.570724 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" event={"ID":"c7910fe0-c205-465a-b8b5-9b56d8bb1941","Type":"ContainerDied","Data":"5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.575923 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.581669 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.593055 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 
12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.605655 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.615366 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.625431 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.627019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.627070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.627086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.627106 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.627123 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.634744 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.644142 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.654577 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.663530 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.674016 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.681249 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.694970 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.703331 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.713726 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 
12:12:08.729972 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:08Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.730197 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.730221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.730231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.730247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.730259 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.833248 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.833291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.833302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.833340 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.833353 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.936017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.936049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.936060 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.936074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:08 crc kubenswrapper[4834]: I1126 12:12:08.936083 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:08Z","lastTransitionTime":"2025-11-26T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.038334 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.038373 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.038384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.038405 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.038419 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.140836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.140890 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.140903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.140924 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.140939 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.243587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.243623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.243633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.243670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.243681 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.345953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.345984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.345994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.346007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.346016 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.416728 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:09 crc kubenswrapper[4834]: E1126 12:12:09.416877 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.448243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.448278 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.448290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.448305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.448339 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.550877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.550926 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.550938 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.550956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.550970 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.582439 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7910fe0-c205-465a-b8b5-9b56d8bb1941" containerID="d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626" exitCode=0 Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.582488 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" event={"ID":"c7910fe0-c205-465a-b8b5-9b56d8bb1941","Type":"ContainerDied","Data":"d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.605156 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.616892 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.629454 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.645244 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.652842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.652879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.652899 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.652914 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.652927 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.656042 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.667254 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.678142 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.687537 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.697204 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.706343 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.716352 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.726965 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.736197 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.745074 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.755645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.755676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.755686 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.755713 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.755726 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.756112 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325
058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:09Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.858707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.858740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.858751 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 
12:12:09.858765 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.858776 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.961193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.961228 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.961241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.961253 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:09 crc kubenswrapper[4834]: I1126 12:12:09.961265 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:09Z","lastTransitionTime":"2025-11-26T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.063776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.063822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.063832 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.063851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.063864 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.167431 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.167817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.167831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.167851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.167865 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.270354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.270390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.270400 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.270417 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.270428 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.374774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.374830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.374845 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.374864 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.374879 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.416051 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.416067 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:10 crc kubenswrapper[4834]: E1126 12:12:10.416190 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:10 crc kubenswrapper[4834]: E1126 12:12:10.416253 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.477776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.477810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.477824 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.477841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.477852 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.580063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.580099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.580112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.580130 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.580141 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.588717 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7910fe0-c205-465a-b8b5-9b56d8bb1941" containerID="a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6" exitCode=0 Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.588776 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" event={"ID":"c7910fe0-c205-465a-b8b5-9b56d8bb1941","Type":"ContainerDied","Data":"a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.606378 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.618357 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.631958 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.647999 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.660888 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.672587 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 
12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.683620 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.683668 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.683694 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.683721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.683736 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.683862 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.695377 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.706853 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.717141 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.728325 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.741436 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-
26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.752457 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.763291 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.773277 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:10Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.786166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.786193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.786203 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.786222 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.786233 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.888995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.889261 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.889274 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.889296 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.889325 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.992039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.992077 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.992086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.992101 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:10 crc kubenswrapper[4834]: I1126 12:12:10.992116 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:10Z","lastTransitionTime":"2025-11-26T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.094269 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.094328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.094341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.094358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.094367 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:11Z","lastTransitionTime":"2025-11-26T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.196922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.196966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.196976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.196993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.197004 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:11Z","lastTransitionTime":"2025-11-26T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.299700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.299733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.299743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.299757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.299766 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:11Z","lastTransitionTime":"2025-11-26T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.402601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.402654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.402674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.402694 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.402705 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:11Z","lastTransitionTime":"2025-11-26T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.416993 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:11 crc kubenswrapper[4834]: E1126 12:12:11.417158 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.505237 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.505286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.505297 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.505334 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.505347 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:11Z","lastTransitionTime":"2025-11-26T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.595111 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7910fe0-c205-465a-b8b5-9b56d8bb1941" containerID="1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a" exitCode=0 Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.595187 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" event={"ID":"c7910fe0-c205-465a-b8b5-9b56d8bb1941","Type":"ContainerDied","Data":"1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.601629 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.601952 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.602025 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.608272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.608304 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.608334 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.608351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:11 
crc kubenswrapper[4834]: I1126 12:12:11.608364 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:11Z","lastTransitionTime":"2025-11-26T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.609656 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.623398 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.627294 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.627361 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.633499 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.643230 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.654261 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.665089 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.675264 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.685035 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.696031 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.705590 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.711922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.711972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.711988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.712009 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.712027 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:11Z","lastTransitionTime":"2025-11-26T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.715143 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.725394 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.736180 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.749001 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.763365 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.776596 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.784182 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.795017 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce606
2a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.807843 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.813828 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.813863 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.813873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.813888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.813897 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:11Z","lastTransitionTime":"2025-11-26T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.819479 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.829621 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.839218 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.849733 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.857711 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.867834 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.876194 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.886386 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.895894 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.904970 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.912779 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:11Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.916864 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.916907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.916920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.916940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:11 crc kubenswrapper[4834]: I1126 12:12:11.916954 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:11Z","lastTransitionTime":"2025-11-26T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.019367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.019396 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.019406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.019425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.019436 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.121933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.121996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.122010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.122040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.122052 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.224090 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.224127 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.224138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.224156 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.224167 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.327251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.327301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.327327 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.327348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.327359 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.416760 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:12 crc kubenswrapper[4834]: E1126 12:12:12.416942 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.416981 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:12 crc kubenswrapper[4834]: E1126 12:12:12.417053 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.430092 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.430145 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.430156 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.430173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.430187 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.430781 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.441430 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.452698 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.460872 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.476057 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.485493 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.497188 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.514112 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.525931 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.532918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.533003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.533014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.533038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.533058 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.537790 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.547912 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 
12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.556752 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.565139 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.573890 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.582074 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.608351 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" event={"ID":"c7910fe0-c205-465a-b8b5-9b56d8bb1941","Type":"ContainerStarted","Data":"546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.608407 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.623806 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.632684 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.635086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.635131 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.635141 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.635157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.635167 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.646997 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.663393 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.673965 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.684591 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.695869 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b7
45be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.707525 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 
12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.719459 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.729183 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.737952 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.738000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.738012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 
12:12:12.738034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.738048 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.741712 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.750254 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.761657 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.771767 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.797030 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.840238 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.840270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.840279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.840293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.840304 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.942146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.942179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.942190 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.942205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:12 crc kubenswrapper[4834]: I1126 12:12:12.942215 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:12Z","lastTransitionTime":"2025-11-26T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.046073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.046523 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.046554 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.046585 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.046597 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.149376 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.149445 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.149458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.149481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.149494 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.251701 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.251748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.251760 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.251780 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.251792 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.354522 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.354579 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.354594 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.354616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.354638 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.416677 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:13 crc kubenswrapper[4834]: E1126 12:12:13.416826 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.421067 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.421121 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.421143 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.421164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.421176 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: E1126 12:12:13.432016 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.434874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.434905 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.434916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.434931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.434941 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: E1126 12:12:13.444540 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.447757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.447795 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.447813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.447831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.447843 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: E1126 12:12:13.458176 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.460996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.461017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.461025 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.461038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.461046 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: E1126 12:12:13.470140 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.472805 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.472846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.472857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.472873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.472887 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: E1126 12:12:13.481551 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: E1126 12:12:13.481684 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.483110 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.483152 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.483166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.483187 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.483201 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.585370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.585414 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.585427 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.585443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.585456 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.613301 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/0.log" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.616722 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec" exitCode=1 Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.616817 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.617785 4834 scope.go:117] "RemoveContainer" containerID="93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.629720 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.642113 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.661003 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"message\\\":\\\"reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 12:12:12.626455 6122 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1126 12:12:12.626803 6122 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 12:12:12.626812 6122 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 12:12:12.626847 6122 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1126 12:12:12.626863 6122 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 12:12:12.626873 6122 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 12:12:12.626877 6122 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 12:12:12.626882 6122 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 12:12:12.626887 6122 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 12:12:12.626905 6122 factory.go:656] Stopping watch factory\\\\nI1126 12:12:12.626916 6122 ovnkube.go:599] Stopped ovnkube\\\\nI1126 12:12:12.626931 6122 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 12:12:12.626935 6122 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 12:12:12.626947 6122 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22
ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.681640 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.687557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.687591 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.687602 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.687628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.687642 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.693934 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e8
7e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.705090 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.717539 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.727677 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.740199 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.750359 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.763368 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.775223 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.785577 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.790386 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.790415 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.790426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc 
kubenswrapper[4834]: I1126 12:12:13.790443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.790454 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.794770 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.802552 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:13Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.892744 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.892781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.892793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.892810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.892819 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.995345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.995396 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.995410 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.995432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:13 crc kubenswrapper[4834]: I1126 12:12:13.995444 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:13Z","lastTransitionTime":"2025-11-26T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.098533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.098587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.098598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.098628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.098639 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:14Z","lastTransitionTime":"2025-11-26T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.201164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.201204 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.201212 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.201229 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.201246 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:14Z","lastTransitionTime":"2025-11-26T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.303486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.303518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.303539 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.303552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.303561 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:14Z","lastTransitionTime":"2025-11-26T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.406836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.406891 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.406908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.406928 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.406940 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:14Z","lastTransitionTime":"2025-11-26T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.417065 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.417194 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:14 crc kubenswrapper[4834]: E1126 12:12:14.417848 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:14 crc kubenswrapper[4834]: E1126 12:12:14.418017 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.509121 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.509168 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.509179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.509198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.509209 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:14Z","lastTransitionTime":"2025-11-26T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.611720 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.611768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.611783 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.611801 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.611814 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:14Z","lastTransitionTime":"2025-11-26T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.621396 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/1.log" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.621957 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/0.log" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.624229 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1" exitCode=1 Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.624275 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.624352 4834 scope.go:117] "RemoveContainer" containerID="93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.624862 4834 scope.go:117] "RemoveContainer" containerID="ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1" Nov 26 12:12:14 crc kubenswrapper[4834]: E1126 12:12:14.624997 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.642018 4834 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc463828
78c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c581
6ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.651043 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.663439 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.676783 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93a993bc42d710c104df5fed23989e079fc39935981528c3ee53b64239ecdfec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"message\\\":\\\"reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 12:12:12.626455 6122 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1126 12:12:12.626803 6122 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI1126 12:12:12.626812 6122 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 12:12:12.626847 6122 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1126 12:12:12.626863 6122 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 12:12:12.626873 6122 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 12:12:12.626877 6122 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 12:12:12.626882 6122 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 12:12:12.626887 6122 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 12:12:12.626905 6122 factory.go:656] Stopping watch factory\\\\nI1126 12:12:12.626916 6122 ovnkube.go:599] Stopped ovnkube\\\\nI1126 12:12:12.626931 6122 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 12:12:12.626935 6122 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 12:12:12.626947 6122 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:14Z\\\",\\\"message\\\":\\\"6270 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-5gvrf in node crc\\\\nI1126 12:12:14.293226 6270 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-5gvrf after 0 failed attempt(s)\\\\nI1126 12:12:14.293229 6270 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-5gvrf\\\\nI1126 12:12:14.293236 6270 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293240 6270 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293245 6270 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF1126 12:12:14.293246 6270 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\
\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.687028 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.695960 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.704762 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.713963 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.714299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.714349 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.714362 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.714381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.714393 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:14Z","lastTransitionTime":"2025-11-26T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.722601 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.732596 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.741585 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b7
45be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.751927 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.761422 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.772101 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.780299 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:14Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.816975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.817011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.817021 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.817040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.817051 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:14Z","lastTransitionTime":"2025-11-26T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.919379 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.919409 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.919419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.919435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:14 crc kubenswrapper[4834]: I1126 12:12:14.919448 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:14Z","lastTransitionTime":"2025-11-26T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.021673 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.021702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.021711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.021724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.021736 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.123587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.123625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.123633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.123644 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.123653 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.225724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.225762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.225774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.225793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.225805 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.327616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.327766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.327866 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.327954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.328031 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.416284 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:15 crc kubenswrapper[4834]: E1126 12:12:15.416686 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.431961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.432018 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.432031 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.432055 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.432076 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.534721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.534849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.534919 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.534984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.535036 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.630148 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/1.log" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.635043 4834 scope.go:117] "RemoveContainer" containerID="ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1" Nov 26 12:12:15 crc kubenswrapper[4834]: E1126 12:12:15.635220 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.637812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.637846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.637857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.637870 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.637882 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.648063 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-pro
xy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.660715 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 
12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.672360 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.681691 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.691054 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.698790 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.707572 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.715757 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-
26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.723419 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.730861 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.737028 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.739470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.739571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.739653 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.739712 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.739768 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.750384 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.758088 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.767805 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.784058 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:14Z\\\",\\\"message\\\":\\\"6270 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-5gvrf in node crc\\\\nI1126 12:12:14.293226 6270 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-5gvrf after 0 failed attempt(s)\\\\nI1126 12:12:14.293229 6270 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-5gvrf\\\\nI1126 12:12:14.293236 6270 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293240 6270 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293245 6270 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF1126 12:12:14.293246 6270 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.842265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.842336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.842351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.842371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.842382 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.951262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.951321 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.951332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.951351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:15 crc kubenswrapper[4834]: I1126 12:12:15.951361 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:15Z","lastTransitionTime":"2025-11-26T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.053683 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.053731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.053744 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.053765 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.053778 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.156693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.156737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.156749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.156768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.156782 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.174342 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.174508 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 12:12:32.174485282 +0000 UTC m=+50.081698634 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.259062 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.259105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.259116 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.259134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.259146 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.275619 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.275686 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.275717 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.275739 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.275789 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.275870 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.275882 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:32.275840867 +0000 UTC m=+50.183054219 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.275890 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.275906 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.275929 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.275954 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:32.275941845 +0000 UTC m=+50.183155198 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.275967 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.275990 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.276070 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:32.276046631 +0000 UTC m=+50.183259983 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.276168 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.276209 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:32.27620122 +0000 UTC m=+50.183414572 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.361600 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.361645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.361657 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.361675 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.361687 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.386412 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb"] Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.387065 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.389475 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.389982 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.400863 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.417874 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.418148 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.418212 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:16 crc kubenswrapper[4834]: E1126 12:12:16.418268 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.418356 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.429476 4834 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.442009 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd
97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.450047 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4274
5f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.460141 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.463324 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.463354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.463364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.463381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.463403 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.468880 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e8
7e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.476489 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.477858 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-env-overrides\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: 
\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.477899 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.477948 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk8qk\" (UniqueName: \"kubernetes.io/projected/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-kube-api-access-qk8qk\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.478073 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.484271 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.491121 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.499998 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.507168 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.516111 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri
-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.539124 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:14Z\\\",\\\"message\\\":\\\"6270 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-5gvrf in node crc\\\\nI1126 12:12:14.293226 6270 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-5gvrf after 0 failed attempt(s)\\\\nI1126 12:12:14.293229 6270 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-5gvrf\\\\nI1126 12:12:14.293236 6270 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293240 6270 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293245 6270 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF1126 12:12:14.293246 6270 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.553073 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.561258 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:16Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.565888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.565944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.565957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.565978 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.565988 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.579368 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk8qk\" (UniqueName: \"kubernetes.io/projected/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-kube-api-access-qk8qk\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.579407 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.579443 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-env-overrides\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.579468 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.580040 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-env-overrides\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.580188 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.585474 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.593235 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk8qk\" (UniqueName: \"kubernetes.io/projected/9d6a40a1-4fc5-447c-9f19-ce50904ebaaa-kube-api-access-qk8qk\") pod \"ovnkube-control-plane-749d76644c-br5pb\" (UID: \"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.668385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.668426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.668439 4834 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.668458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.668468 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.697797 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" Nov 26 12:12:16 crc kubenswrapper[4834]: W1126 12:12:16.715244 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d6a40a1_4fc5_447c_9f19_ce50904ebaaa.slice/crio-de172871f50a9f900595d8a0d9b8d744929af07a25ea41654142896b5c143d37 WatchSource:0}: Error finding container de172871f50a9f900595d8a0d9b8d744929af07a25ea41654142896b5c143d37: Status 404 returned error can't find the container with id de172871f50a9f900595d8a0d9b8d744929af07a25ea41654142896b5c143d37 Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.773342 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.773625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.773636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.773653 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.773664 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.875809 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.875850 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.875862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.875878 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.875889 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.977880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.977925 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.977935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.977951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:16 crc kubenswrapper[4834]: I1126 12:12:16.977962 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:16Z","lastTransitionTime":"2025-11-26T12:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.080654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.080706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.080716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.080733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.080745 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:17Z","lastTransitionTime":"2025-11-26T12:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.182484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.182524 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.182533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.182547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.182557 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:17Z","lastTransitionTime":"2025-11-26T12:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.200710 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.213584 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887
f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.224261 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.234236 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.246652 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:14Z\\\",\\\"message\\\":\\\"6270 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-5gvrf in node crc\\\\nI1126 12:12:14.293226 6270 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-5gvrf after 0 failed attempt(s)\\\\nI1126 12:12:14.293229 6270 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-5gvrf\\\\nI1126 12:12:14.293236 6270 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293240 6270 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293245 6270 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF1126 12:12:14.293246 6270 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.257055 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.266057 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.279508 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.285188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.285240 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.285252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 
12:12:17.285273 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.285285 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:17Z","lastTransitionTime":"2025-11-26T12:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.288821 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.296160 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.304979 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee6
5f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.314584 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.323859 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.332222 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.344416 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.351827 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.360218 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.387919 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.387957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.387971 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.387988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.387999 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:17Z","lastTransitionTime":"2025-11-26T12:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.416411 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:17 crc kubenswrapper[4834]: E1126 12:12:17.416535 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.489976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.490002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.490012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.490026 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.490055 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:17Z","lastTransitionTime":"2025-11-26T12:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.592180 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.592223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.592232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.592252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.592263 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:17Z","lastTransitionTime":"2025-11-26T12:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.642905 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" event={"ID":"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa","Type":"ContainerStarted","Data":"d0ea85335817577413bb249f8be48e83ae2ab77714e8365d20aacaf448050ac8"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.642964 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" event={"ID":"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa","Type":"ContainerStarted","Data":"8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.642978 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" event={"ID":"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa","Type":"ContainerStarted","Data":"de172871f50a9f900595d8a0d9b8d744929af07a25ea41654142896b5c143d37"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.661099 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.680507 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.690980 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.694164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.694194 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.694205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.694217 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.694225 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:17Z","lastTransitionTime":"2025-11-26T12:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.703076 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:14Z\\\",\\\"message\\\":\\\"6270 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-5gvrf in node crc\\\\nI1126 12:12:14.293226 6270 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-5gvrf after 0 failed attempt(s)\\\\nI1126 12:12:14.293229 6270 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-5gvrf\\\\nI1126 12:12:14.293236 6270 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293240 6270 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293245 6270 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF1126 12:12:14.293246 6270 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.712048 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.720682 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.729689 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.739389 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.747844 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.756765 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.764954 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.775035 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.783359 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.791962 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.796969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.797005 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.797017 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.797035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.797047 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:17Z","lastTransitionTime":"2025-11-26T12:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.799814 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325
058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.808700 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:17Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.900329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.900376 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.900385 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.900404 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:17 crc kubenswrapper[4834]: I1126 12:12:17.900414 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:17Z","lastTransitionTime":"2025-11-26T12:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.002843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.002885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.002895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.002912 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.002921 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.105177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.105233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.105243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.105256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.105266 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.189603 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-tmlsw"] Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.190289 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:18 crc kubenswrapper[4834]: E1126 12:12:18.190384 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.202046 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.207662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.207691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.207702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.207718 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.207728 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.211749 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.222129 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.234349 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.243639 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc 
kubenswrapper[4834]: I1126 12:12:18.252713 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.266477 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.275185 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.284988 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.295810 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvfrf\" (UniqueName: \"kubernetes.io/projected/6feada4f-ea0c-4062-ab87-ff88a4590c96-kube-api-access-hvfrf\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.295850 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.297815 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:14Z\\\",\\\"message\\\":\\\"6270 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-5gvrf in node crc\\\\nI1126 12:12:14.293226 6270 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-5gvrf after 0 failed attempt(s)\\\\nI1126 12:12:14.293229 6270 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-5gvrf\\\\nI1126 12:12:14.293236 6270 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293240 6270 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293245 6270 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF1126 12:12:14.293246 6270 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.305406 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.309612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.309650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.309661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 crc 
kubenswrapper[4834]: I1126 12:12:18.309677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.309688 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.316191 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12
:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 
12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.325042 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.334434 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.342900 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.349873 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.358663 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:18Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.396443 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvfrf\" (UniqueName: \"kubernetes.io/projected/6feada4f-ea0c-4062-ab87-ff88a4590c96-kube-api-access-hvfrf\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.396489 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:18 crc kubenswrapper[4834]: E1126 12:12:18.396630 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:18 crc kubenswrapper[4834]: E1126 12:12:18.396678 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs podName:6feada4f-ea0c-4062-ab87-ff88a4590c96 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:18.896665382 +0000 UTC m=+36.803878733 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs") pod "network-metrics-daemon-tmlsw" (UID: "6feada4f-ea0c-4062-ab87-ff88a4590c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.412219 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.412256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.412266 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.412283 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.412293 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.416384 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.416394 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:18 crc kubenswrapper[4834]: E1126 12:12:18.416498 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:18 crc kubenswrapper[4834]: E1126 12:12:18.416610 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.422014 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvfrf\" (UniqueName: \"kubernetes.io/projected/6feada4f-ea0c-4062-ab87-ff88a4590c96-kube-api-access-hvfrf\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.515190 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.515251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.515264 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 
crc kubenswrapper[4834]: I1126 12:12:18.515278 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.515288 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.618303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.618365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.618375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.618394 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.618404 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.720592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.720650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.720664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.720688 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.720701 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.823745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.823799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.823811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.823830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.823842 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.901463 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:18 crc kubenswrapper[4834]: E1126 12:12:18.901595 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:18 crc kubenswrapper[4834]: E1126 12:12:18.901668 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs podName:6feada4f-ea0c-4062-ab87-ff88a4590c96 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:19.901647693 +0000 UTC m=+37.808861055 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs") pod "network-metrics-daemon-tmlsw" (UID: "6feada4f-ea0c-4062-ab87-ff88a4590c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.926419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.926462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.926480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.926498 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:18 crc kubenswrapper[4834]: I1126 12:12:18.926507 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:18Z","lastTransitionTime":"2025-11-26T12:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.029543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.029594 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.029604 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.029621 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.029634 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.132199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.132249 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.132261 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.132281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.132294 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.233913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.233951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.233960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.233972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.233985 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.336887 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.336933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.336943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.336957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.336967 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.416824 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:19 crc kubenswrapper[4834]: E1126 12:12:19.416987 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.439593 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.439643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.439653 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.439671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.439685 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.542279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.542354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.542366 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.542392 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.542408 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.645417 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.645472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.645486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.645507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.646452 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.748174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.748232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.748242 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.748268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.748278 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.850874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.850924 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.850934 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.850951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.850964 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.910433 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:19 crc kubenswrapper[4834]: E1126 12:12:19.910571 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:19 crc kubenswrapper[4834]: E1126 12:12:19.910652 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs podName:6feada4f-ea0c-4062-ab87-ff88a4590c96 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:21.910630858 +0000 UTC m=+39.817844211 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs") pod "network-metrics-daemon-tmlsw" (UID: "6feada4f-ea0c-4062-ab87-ff88a4590c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.953781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.953827 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.953836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.953856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:19 crc kubenswrapper[4834]: I1126 12:12:19.953867 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:19Z","lastTransitionTime":"2025-11-26T12:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.055730 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.055777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.055787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.055802 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.055814 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.158260 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.158299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.158328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.158342 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.158352 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.260544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.260589 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.260598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.260613 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.260623 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.363105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.363148 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.363160 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.363174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.363183 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.416989 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.417040 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.416998 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:20 crc kubenswrapper[4834]: E1126 12:12:20.417121 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:20 crc kubenswrapper[4834]: E1126 12:12:20.417272 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:20 crc kubenswrapper[4834]: E1126 12:12:20.417468 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.465009 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.465050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.465059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.465076 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.465089 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.567128 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.567178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.567191 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.567202 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.567212 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.668990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.669036 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.669045 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.669057 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.669066 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.770736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.770779 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.770788 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.770801 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.770811 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.872911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.872948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.872957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.872970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.872979 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.975671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.975713 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.975723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.975736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:20 crc kubenswrapper[4834]: I1126 12:12:20.975745 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:20Z","lastTransitionTime":"2025-11-26T12:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.078053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.078099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.078108 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.078126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.078138 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.180229 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.180270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.180280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.180294 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.180304 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.282602 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.282646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.282655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.282668 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.282677 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.384774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.384835 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.384845 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.384861 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.384870 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.416388 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:21 crc kubenswrapper[4834]: E1126 12:12:21.416518 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.486835 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.486884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.486896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.486906 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.486913 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.588929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.588975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.588986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.589004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.589015 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.690841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.690871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.690898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.690910 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.690920 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.792612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.792655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.792664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.792678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.792687 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.895079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.895121 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.895132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.895144 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.895155 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.928903 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:21 crc kubenswrapper[4834]: E1126 12:12:21.929124 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:21 crc kubenswrapper[4834]: E1126 12:12:21.929249 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs podName:6feada4f-ea0c-4062-ab87-ff88a4590c96 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:25.929223963 +0000 UTC m=+43.836437315 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs") pod "network-metrics-daemon-tmlsw" (UID: "6feada4f-ea0c-4062-ab87-ff88a4590c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.996571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.996667 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.996681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.996696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:21 crc kubenswrapper[4834]: I1126 12:12:21.996704 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:21Z","lastTransitionTime":"2025-11-26T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.100387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.100430 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.100441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.100459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.100471 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:22Z","lastTransitionTime":"2025-11-26T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.202849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.202904 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.202916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.202940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.202951 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:22Z","lastTransitionTime":"2025-11-26T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.304731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.304785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.304795 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.304815 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.304828 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:22Z","lastTransitionTime":"2025-11-26T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.407053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.407326 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.407337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.407352 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.407361 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:22Z","lastTransitionTime":"2025-11-26T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.416466 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.416498 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:22 crc kubenswrapper[4834]: E1126 12:12:22.416576 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:22 crc kubenswrapper[4834]: E1126 12:12:22.416693 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.416848 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:22 crc kubenswrapper[4834]: E1126 12:12:22.416981 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.428029 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-
plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.443701 4834 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc463828
78c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c581
6ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.453554 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.466130 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.483898 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:14Z\\\",\\\"message\\\":\\\"6270 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-5gvrf in node crc\\\\nI1126 12:12:14.293226 6270 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-5gvrf after 0 failed attempt(s)\\\\nI1126 12:12:14.293229 6270 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-5gvrf\\\\nI1126 12:12:14.293236 6270 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293240 6270 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293245 6270 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF1126 12:12:14.293246 6270 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.493335 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.502453 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b7
45be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.509551 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.509586 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.509597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 
26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.509614 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.509627 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:22Z","lastTransitionTime":"2025-11-26T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.513191 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.522097 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.530687 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.539636 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.546846 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.554078 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc 
kubenswrapper[4834]: I1126 12:12:22.563939 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.572250 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.581491 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.592009 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:22Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.611746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.611788 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.611799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.611816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.611827 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:22Z","lastTransitionTime":"2025-11-26T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.714438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.714497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.714511 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.714529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.714542 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:22Z","lastTransitionTime":"2025-11-26T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.817215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.817252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.817263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.817276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.817287 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:22Z","lastTransitionTime":"2025-11-26T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.919628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.919675 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.919687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.919703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:22 crc kubenswrapper[4834]: I1126 12:12:22.919715 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:22Z","lastTransitionTime":"2025-11-26T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.021673 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.021730 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.021743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.021762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.021778 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.123879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.123942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.123953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.123977 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.123991 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.226045 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.226088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.226097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.226112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.226122 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.328670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.328718 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.328728 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.328743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.328754 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.416472 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:23 crc kubenswrapper[4834]: E1126 12:12:23.416586 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.430782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.430812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.430822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.430836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.430847 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.533360 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.533424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.533434 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.533475 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.533492 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.564558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.564612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.564625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.564641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.564652 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: E1126 12:12:23.574793 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:23Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.577537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.577587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.577598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.577613 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.577623 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: E1126 12:12:23.585959 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:23Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.588013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.588043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.588053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.588070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.588078 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: E1126 12:12:23.596043 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:23Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.599263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.599289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.599300 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.599332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.599341 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: E1126 12:12:23.607732 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:23Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.610283 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.610337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.610348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.610362 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.610372 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: E1126 12:12:23.618653 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:23Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:23 crc kubenswrapper[4834]: E1126 12:12:23.618777 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.636059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.636102 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.636112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.636126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.636136 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.737922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.737970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.737980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.737994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.738005 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.840881 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.840935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.840945 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.840964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.840976 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.943860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.943932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.943946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.943976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:23 crc kubenswrapper[4834]: I1126 12:12:23.944008 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:23Z","lastTransitionTime":"2025-11-26T12:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.047114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.047181 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.047194 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.047211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.047222 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.149885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.149947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.149957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.149994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.150005 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.251589 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.251656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.251668 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.251680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.251690 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.353293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.353359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.353370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.353387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.353397 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.416390 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.416438 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.416471 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:24 crc kubenswrapper[4834]: E1126 12:12:24.416590 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:24 crc kubenswrapper[4834]: E1126 12:12:24.416688 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:24 crc kubenswrapper[4834]: E1126 12:12:24.416779 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.455734 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.455779 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.455790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.455818 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.455832 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.558023 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.558075 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.558086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.558108 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.558119 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.660603 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.660638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.660648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.660660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.660670 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.762504 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.762533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.762542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.762553 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.762565 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.864688 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.865049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.865136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.865207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.865268 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.967661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.967707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.967717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.967732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:24 crc kubenswrapper[4834]: I1126 12:12:24.967743 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:24Z","lastTransitionTime":"2025-11-26T12:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.069626 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.069670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.069682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.069698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.069710 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.171860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.171946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.171960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.171985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.172000 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.274015 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.274069 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.274087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.274107 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.274119 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.376554 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.376609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.376622 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.376638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.376652 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.416811 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:25 crc kubenswrapper[4834]: E1126 12:12:25.416991 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.479410 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.479470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.479483 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.479498 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.479508 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.582050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.582101 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.582111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.582128 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.582142 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.683415 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.683472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.683481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.683496 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.683507 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.785343 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.785381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.785391 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.785408 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.785425 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.888563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.888603 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.888612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.888631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.888642 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.969539 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:25 crc kubenswrapper[4834]: E1126 12:12:25.969733 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:25 crc kubenswrapper[4834]: E1126 12:12:25.969828 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs podName:6feada4f-ea0c-4062-ab87-ff88a4590c96 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:33.969803981 +0000 UTC m=+51.877017343 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs") pod "network-metrics-daemon-tmlsw" (UID: "6feada4f-ea0c-4062-ab87-ff88a4590c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.991137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.991169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.991178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.991191 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:25 crc kubenswrapper[4834]: I1126 12:12:25.991201 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:25Z","lastTransitionTime":"2025-11-26T12:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.093252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.093303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.093333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.093351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.093361 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:26Z","lastTransitionTime":"2025-11-26T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.195584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.195626 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.195635 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.195654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.195664 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:26Z","lastTransitionTime":"2025-11-26T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.297765 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.297791 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.297799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.297812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.297820 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:26Z","lastTransitionTime":"2025-11-26T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.400227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.400267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.400275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.400290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.400303 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:26Z","lastTransitionTime":"2025-11-26T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.417490 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.417513 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:26 crc kubenswrapper[4834]: E1126 12:12:26.417607 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.417687 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:26 crc kubenswrapper[4834]: E1126 12:12:26.417812 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:26 crc kubenswrapper[4834]: E1126 12:12:26.417871 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.502419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.502473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.502484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.502503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.502517 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:26Z","lastTransitionTime":"2025-11-26T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.604622 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.604685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.604697 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.604721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.604735 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:26Z","lastTransitionTime":"2025-11-26T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.707115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.707151 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.707163 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.707177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.707185 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:26Z","lastTransitionTime":"2025-11-26T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.809921 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.809962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.809973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.809985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.809994 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:26Z","lastTransitionTime":"2025-11-26T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.913327 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.913370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.913383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.913395 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:26 crc kubenswrapper[4834]: I1126 12:12:26.913417 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:26Z","lastTransitionTime":"2025-11-26T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.015926 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.015981 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.015990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.016008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.016019 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.118368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.118434 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.118446 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.118463 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.118477 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.220825 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.220862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.220871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.220883 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.220892 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.323723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.323800 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.323813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.323830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.323841 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.416772 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:27 crc kubenswrapper[4834]: E1126 12:12:27.417190 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.417444 4834 scope.go:117] "RemoveContainer" containerID="ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.425989 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.426039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.426051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.426066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.426079 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.528984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.529015 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.529025 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.529039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.529051 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.631292 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.631345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.631355 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.631373 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.631384 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.674092 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/1.log" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.676941 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.677080 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.689884 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.705982 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc 
kubenswrapper[4834]: I1126 12:12:27.720034 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.730470 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.733733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.733760 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.733770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.733784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.733794 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.748057 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.767358 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.789900 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.801041 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.811887 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.824464 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:14Z\\\",\\\"message\\\":\\\"6270 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-5gvrf in node crc\\\\nI1126 12:12:14.293226 6270 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-5gvrf after 0 failed attempt(s)\\\\nI1126 12:12:14.293229 6270 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-5gvrf\\\\nI1126 12:12:14.293236 6270 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293240 6270 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293245 6270 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF1126 12:12:14.293246 6270 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.832073 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.835527 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.835567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.835576 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.835591 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.835601 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.841105 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z 
is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.850051 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.859882 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.871835 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.882774 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.892865 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:27Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.937527 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.937558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.937567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.937580 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:27 crc kubenswrapper[4834]: I1126 12:12:27.937589 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:27Z","lastTransitionTime":"2025-11-26T12:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.039413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.039770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.039779 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.039795 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.039807 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.142690 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.142756 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.142768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.142789 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.142799 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.246146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.246361 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.246448 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.246526 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.246581 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.349231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.349267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.349276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.349290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.349299 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.416534 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.416557 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.416653 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:28 crc kubenswrapper[4834]: E1126 12:12:28.417006 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:28 crc kubenswrapper[4834]: E1126 12:12:28.417086 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:28 crc kubenswrapper[4834]: E1126 12:12:28.417099 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.451570 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.451615 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.451626 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.451639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.451650 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.553998 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.554028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.554037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.554049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.554060 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.655934 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.655972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.655980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.655995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.656005 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.680883 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/2.log" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.681419 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/1.log" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.683488 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea" exitCode=1 Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.683571 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.683668 4834 scope.go:117] "RemoveContainer" containerID="ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.684574 4834 scope.go:117] "RemoveContainer" containerID="5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea" Nov 26 12:12:28 crc kubenswrapper[4834]: E1126 12:12:28.684700 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.695097 4834 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc 
kubenswrapper[4834]: I1126 12:12:28.709364 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.718847 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.727218 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.733762 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.742100 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.758402 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.758484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.758495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.758512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.758522 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.761990 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.770024 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.779983 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.792585 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff090784fb64d91b7daf01d3247623b99d0da01a075df297d752518733180af1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:14Z\\\",\\\"message\\\":\\\"6270 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-5gvrf in node crc\\\\nI1126 12:12:14.293226 6270 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-5gvrf after 0 failed attempt(s)\\\\nI1126 12:12:14.293229 6270 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-5gvrf\\\\nI1126 12:12:14.293236 6270 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293240 6270 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1126 12:12:14.293245 6270 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF1126 12:12:14.293246 6270 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, 
protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\
"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.803007 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\"
:\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.811293 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.821568 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.830950 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.840342 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.849468 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.857632 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:28Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.861639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.861684 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.861694 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.861714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.861726 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.964449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.964489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.964499 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.964518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:28 crc kubenswrapper[4834]: I1126 12:12:28.964528 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:28Z","lastTransitionTime":"2025-11-26T12:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.066286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.066378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.066392 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.066412 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.066427 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.105505 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.169498 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.169550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.169564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.169580 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.169592 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.271831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.271872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.271882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.271906 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.271923 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.374197 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.374245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.374257 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.374280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.374292 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.416466 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:29 crc kubenswrapper[4834]: E1126 12:12:29.416699 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.476903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.476965 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.476976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.476995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.477009 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.578918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.578955 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.578965 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.578982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.578995 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.680982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.681013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.681025 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.681039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.681055 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.687595 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/2.log" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.691101 4834 scope.go:117] "RemoveContainer" containerID="5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea" Nov 26 12:12:29 crc kubenswrapper[4834]: E1126 12:12:29.691285 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.708714 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.719825 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.731196 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.746474 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.756369 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.766106 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b7
45be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.776790 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.783491 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.783529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.783539 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.783554 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.783566 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.786644 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e8
7e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.796262 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.805857 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.813563 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.821061 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc 
kubenswrapper[4834]: I1126 12:12:29.830956 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.840821 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.849375 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.857123 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.865267 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:29Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.886275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.886331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.886346 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.886372 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.886384 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.988647 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.988691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.988705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.988726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:29 crc kubenswrapper[4834]: I1126 12:12:29.988740 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:29Z","lastTransitionTime":"2025-11-26T12:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.091661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.091737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.091749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.091768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.091782 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:30Z","lastTransitionTime":"2025-11-26T12:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.193719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.193752 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.193763 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.193793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.193802 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:30Z","lastTransitionTime":"2025-11-26T12:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.295839 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.295883 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.295895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.295912 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.295925 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:30Z","lastTransitionTime":"2025-11-26T12:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.398021 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.398077 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.398087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.398112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.398125 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:30Z","lastTransitionTime":"2025-11-26T12:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.416429 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.416429 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.416543 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:30 crc kubenswrapper[4834]: E1126 12:12:30.416687 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:30 crc kubenswrapper[4834]: E1126 12:12:30.416800 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:30 crc kubenswrapper[4834]: E1126 12:12:30.416855 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.499952 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.499979 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.499988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.500001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.500010 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:30Z","lastTransitionTime":"2025-11-26T12:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.603730 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.603856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.603919 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.604000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.604055 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:30Z","lastTransitionTime":"2025-11-26T12:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.706377 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.706438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.706449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.706475 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.706489 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:30Z","lastTransitionTime":"2025-11-26T12:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.808560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.808599 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.808611 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.808627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.808640 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:30Z","lastTransitionTime":"2025-11-26T12:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.911385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.911429 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.911441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.911455 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:30 crc kubenswrapper[4834]: I1126 12:12:30.911465 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:30Z","lastTransitionTime":"2025-11-26T12:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.013640 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.013680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.013693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.013705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.013715 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.116202 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.116253 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.116265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.116289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.116326 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.218025 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.218061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.218073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.218089 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.218123 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.320112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.320139 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.320148 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.320159 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.320167 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.416497 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:31 crc kubenswrapper[4834]: E1126 12:12:31.416658 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.422193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.422225 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.422236 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.422252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.422263 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.524367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.524413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.524428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.524445 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.524457 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.556242 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.566833 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.570076 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.579913 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.588224 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.595250 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.603962 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.611974 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.621197 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a
4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.626536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.626564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.626574 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.626588 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.626598 
4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.629909 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.638762 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.646908 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.653857 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.661080 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc 
kubenswrapper[4834]: I1126 12:12:31.670080 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.679781 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.691220 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.705506 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.721228 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:31Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.728678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.728709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.728718 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.728733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.728743 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.831206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.831246 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.831256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.831270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.831282 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.934028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.934058 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.934066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.934084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:31 crc kubenswrapper[4834]: I1126 12:12:31.934093 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:31Z","lastTransitionTime":"2025-11-26T12:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.036026 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.036566 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.036636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.036723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.036809 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.139876 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.139923 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.139943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.139961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.139972 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.229228 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.229456 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 12:13:04.229429193 +0000 UTC m=+82.136642545 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.242766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.242791 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.242815 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.242828 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.242837 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.330178 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.330447 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.330561 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.330678 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.330288 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.330830 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.330859 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.330876 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.330577 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.330654 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.330944 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.330954 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.331070 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:13:04.330846613 +0000 UTC m=+82.238059965 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.331153 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 12:13:04.331141254 +0000 UTC m=+82.238354606 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.331231 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:13:04.331223266 +0000 UTC m=+82.238436609 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.331330 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 12:13:04.331298085 +0000 UTC m=+82.238511438 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.345803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.345830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.345839 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.345854 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.345863 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.416905 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.416943 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.416912 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.417030 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.417122 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:32 crc kubenswrapper[4834]: E1126 12:12:32.417193 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.428661 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.437873 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.447169 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.448196 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.448228 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.448239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.448251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.448262 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.457697 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.470988 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.489430 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.498797 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:
43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.507178 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.516892 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.523875 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.533415 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.541590 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.550420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.550445 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.550454 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.550468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.550477 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.551001 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.560035 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.567955 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.575954 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.583203 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.590389 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:32Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:32 crc 
kubenswrapper[4834]: I1126 12:12:32.652338 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.652376 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.652388 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.652408 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.652420 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.754743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.754807 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.754819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.754844 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.754856 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.856976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.857007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.857017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.857031 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.857042 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.964015 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.964127 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.964189 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.964259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:32 crc kubenswrapper[4834]: I1126 12:12:32.964372 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:32Z","lastTransitionTime":"2025-11-26T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.067039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.067459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.067525 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.067581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.067646 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.170293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.170368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.170377 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.170396 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.170410 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.272727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.272759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.272771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.272784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.272799 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.374544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.374589 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.374602 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.374617 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.374628 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.416222 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:33 crc kubenswrapper[4834]: E1126 12:12:33.416378 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.476706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.476748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.476759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.476776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.476790 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.578567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.578627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.578639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.578651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.578662 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.680445 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.680473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.680481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.680492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.680500 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.782553 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.782665 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.782726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.782782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.782833 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.884917 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.885011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.885481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.885555 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.885619 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.987991 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.988022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.988031 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.988044 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:33 crc kubenswrapper[4834]: I1126 12:12:33.988052 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:33Z","lastTransitionTime":"2025-11-26T12:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.005054 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.005113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.005122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.005137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.005148 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.015854 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:34Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.018879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.018970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.019050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.019116 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.019182 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.029014 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:34Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.031549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.031580 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.031610 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.031624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.031634 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.040792 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:34Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.043560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.043613 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.043625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.043637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.043651 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.045123 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.045357 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.045439 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs podName:6feada4f-ea0c-4062-ab87-ff88a4590c96 nodeName:}" failed. No retries permitted until 2025-11-26 12:12:50.045418808 +0000 UTC m=+67.952632160 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs") pod "network-metrics-daemon-tmlsw" (UID: "6feada4f-ea0c-4062-ab87-ff88a4590c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.052235 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f
69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:34Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.055057 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.055096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.055109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.055124 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.055134 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.063573 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:34Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.063710 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.089934 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.089963 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.089976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.089988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.089996 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.192448 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.192479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.192487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.192497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.192508 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.294393 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.294430 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.294441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.294452 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.294460 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.395929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.395953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.395982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.395994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.396002 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.416481 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.416514 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.416596 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.416481 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.416691 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:34 crc kubenswrapper[4834]: E1126 12:12:34.416749 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.497987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.498032 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.498041 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.498056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.498065 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.600194 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.600247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.600258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.600286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.600298 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.702747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.702789 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.702799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.702817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.702827 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.804624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.804674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.804686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.804703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.804717 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.906561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.906598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.906609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.906622 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:34 crc kubenswrapper[4834]: I1126 12:12:34.906635 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:34Z","lastTransitionTime":"2025-11-26T12:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.009371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.009409 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.009419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.009433 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.009446 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.111347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.111382 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.111394 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.111410 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.111422 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.213164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.213198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.213209 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.213220 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.213230 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.315121 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.315184 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.315195 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.315215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.315228 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.416524 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:35 crc kubenswrapper[4834]: E1126 12:12:35.416659 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.417840 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.417875 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.417886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.417903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.417915 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.519922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.519993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.520018 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.520037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.520054 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.622506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.622561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.622573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.622594 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.622606 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.724414 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.724470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.724480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.724500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.724513 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.826858 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.826909 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.826918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.826935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.826948 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.929741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.929786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.929797 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.929833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:35 crc kubenswrapper[4834]: I1126 12:12:35.929848 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:35Z","lastTransitionTime":"2025-11-26T12:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.032184 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.032235 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.032258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.032275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.032282 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.134661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.134714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.134726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.134740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.134749 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.236973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.237058 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.237072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.237083 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.237092 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.339140 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.339174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.339237 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.339262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.339270 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.417066 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:36 crc kubenswrapper[4834]: E1126 12:12:36.417191 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.417504 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.417596 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:36 crc kubenswrapper[4834]: E1126 12:12:36.417685 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:36 crc kubenswrapper[4834]: E1126 12:12:36.417792 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.441138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.441197 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.441207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.441226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.441238 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.543051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.543089 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.543120 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.543133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.543142 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.644693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.644729 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.644738 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.644750 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.644759 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.746332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.746364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.746375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.746390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.746402 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.848016 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.848071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.848080 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.848097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.848111 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.950592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.950639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.950651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.950666 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:36 crc kubenswrapper[4834]: I1126 12:12:36.950680 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:36Z","lastTransitionTime":"2025-11-26T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.052413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.052452 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.052466 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.052481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.052491 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.155118 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.155151 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.155163 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.155198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.155208 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.256756 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.256864 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.256927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.256999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.257067 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.359681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.359724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.359733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.359749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.359765 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.416928 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:37 crc kubenswrapper[4834]: E1126 12:12:37.417050 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.461495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.461529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.461542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.461555 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.461568 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.563599 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.563651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.563661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.563671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.563679 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.666547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.666597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.666610 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.666628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.666640 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.768507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.768567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.768577 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.768594 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.768608 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.870325 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.870459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.870521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.870587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.870639 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.972327 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.972438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.972498 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.972573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:37 crc kubenswrapper[4834]: I1126 12:12:37.972625 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:37Z","lastTransitionTime":"2025-11-26T12:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.074538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.074655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.074716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.074772 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.074825 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.177510 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.177562 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.177572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.177591 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.177602 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.279617 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.279687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.279703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.279727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.279752 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.381265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.381300 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.381329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.381344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.381355 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.416368 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.416389 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:38 crc kubenswrapper[4834]: E1126 12:12:38.416543 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.416610 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:38 crc kubenswrapper[4834]: E1126 12:12:38.416750 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:38 crc kubenswrapper[4834]: E1126 12:12:38.416849 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.483383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.483430 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.483441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.483456 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.483467 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.585989 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.586037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.586051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.586072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.586084 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.688118 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.688163 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.688174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.688188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.688199 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.790587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.790630 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.790643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.790658 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.790668 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.892926 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.893131 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.893225 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.893357 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.893445 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.996088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.996139 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.996150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.996169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:38 crc kubenswrapper[4834]: I1126 12:12:38.996182 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:38Z","lastTransitionTime":"2025-11-26T12:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.098279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.098337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.098348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.098360 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.098374 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:39Z","lastTransitionTime":"2025-11-26T12:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.204841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.204888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.204902 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.204918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.204934 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:39Z","lastTransitionTime":"2025-11-26T12:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.306977 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.307098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.307191 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.307283 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.307380 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:39Z","lastTransitionTime":"2025-11-26T12:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.409223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.409259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.409270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.409284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.409294 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:39Z","lastTransitionTime":"2025-11-26T12:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.416671 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:39 crc kubenswrapper[4834]: E1126 12:12:39.416794 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.511436 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.511464 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.511474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.511486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.511496 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:39Z","lastTransitionTime":"2025-11-26T12:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.612956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.612996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.613006 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.613027 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.613036 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:39Z","lastTransitionTime":"2025-11-26T12:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.714942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.714980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.714990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.715003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.715013 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:39Z","lastTransitionTime":"2025-11-26T12:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.817200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.817237 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.817247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.817260 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.817271 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:39Z","lastTransitionTime":"2025-11-26T12:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.919607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.919729 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.919795 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.919863 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:39 crc kubenswrapper[4834]: I1126 12:12:39.919922 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:39Z","lastTransitionTime":"2025-11-26T12:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.021945 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.022006 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.022019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.022038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.022051 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.124245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.124396 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.124466 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.124542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.124603 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.226699 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.226743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.226753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.226771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.226783 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.328782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.328821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.328831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.328847 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.328857 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.416220 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:40 crc kubenswrapper[4834]: E1126 12:12:40.416368 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.416532 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.416606 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:40 crc kubenswrapper[4834]: E1126 12:12:40.416677 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:40 crc kubenswrapper[4834]: E1126 12:12:40.416883 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.430348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.430388 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.430399 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.430416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.430429 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.532787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.532846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.532858 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.532869 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.532877 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.635074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.635119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.635129 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.635144 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.635155 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.736912 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.736954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.736964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.736978 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.736989 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.839372 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.839425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.839438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.839450 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.839458 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.941878 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.941923 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.941932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.941947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:40 crc kubenswrapper[4834]: I1126 12:12:40.941962 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:40Z","lastTransitionTime":"2025-11-26T12:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.044539 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.044588 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.044601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.044616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.044628 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.146717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.146768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.146776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.146792 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.146803 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.249282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.249344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.249354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.249372 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.249385 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.351949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.351980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.351988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.351999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.352008 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.416812 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:41 crc kubenswrapper[4834]: E1126 12:12:41.416919 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.417577 4834 scope.go:117] "RemoveContainer" containerID="5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea" Nov 26 12:12:41 crc kubenswrapper[4834]: E1126 12:12:41.417793 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.453657 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.453685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.453706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.453718 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.453726 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.555221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.555266 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.555275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.555298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.555324 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.659275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.659390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.659409 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.659441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.659463 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.762054 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.762093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.762102 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.762117 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.762127 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.864175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.864233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.864244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.864258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.864269 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.966443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.966490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.966499 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.966515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:41 crc kubenswrapper[4834]: I1126 12:12:41.966525 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:41Z","lastTransitionTime":"2025-11-26T12:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.068563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.068611 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.068623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.068641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.068654 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.170682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.170731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.170741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.170757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.170771 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.272710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.272754 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.272762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.272778 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.272795 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.375095 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.375127 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.375138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.375161 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.375173 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.416241 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.416241 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.416273 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:42 crc kubenswrapper[4834]: E1126 12:12:42.416457 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:42 crc kubenswrapper[4834]: E1126 12:12:42.416544 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:42 crc kubenswrapper[4834]: E1126 12:12:42.416642 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.427878 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.436936 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.453442 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.463241 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.474271 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.477173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.477201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.477211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.477227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.477240 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.485510 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.494251 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.509469 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.518037 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.529576 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.543276 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.551241 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.560885 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.571032 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.579391 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.579422 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.579432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.579446 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.579458 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.583132 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.592020 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.600279 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.609739 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:42Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.683293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.683381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.683392 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.683413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.683425 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.786073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.786121 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.786131 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.786160 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.786175 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.889164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.889212 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.889222 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.889244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.889256 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.991279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.991347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.991359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.991378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:42 crc kubenswrapper[4834]: I1126 12:12:42.991390 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:42Z","lastTransitionTime":"2025-11-26T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.093413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.093458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.093468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.093483 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.093494 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:43Z","lastTransitionTime":"2025-11-26T12:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.195075 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.195114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.195137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.195152 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.195165 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:43Z","lastTransitionTime":"2025-11-26T12:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.298407 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.298460 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.298470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.298487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.298500 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:43Z","lastTransitionTime":"2025-11-26T12:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.400629 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.400667 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.400678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.400690 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.400700 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:43Z","lastTransitionTime":"2025-11-26T12:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.416598 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:43 crc kubenswrapper[4834]: E1126 12:12:43.416792 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.502745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.502798 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.502811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.502834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.502847 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:43Z","lastTransitionTime":"2025-11-26T12:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.605105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.605177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.605187 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.605207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.605220 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:43Z","lastTransitionTime":"2025-11-26T12:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.707002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.707032 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.707042 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.707055 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.707064 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:43Z","lastTransitionTime":"2025-11-26T12:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.808954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.808981 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.808990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.809001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.809008 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:43Z","lastTransitionTime":"2025-11-26T12:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.910146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.910177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.910186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.910198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:43 crc kubenswrapper[4834]: I1126 12:12:43.910206 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:43Z","lastTransitionTime":"2025-11-26T12:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.012481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.012538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.012550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.012573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.012588 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.115402 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.115453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.115462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.115480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.115495 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.195451 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.195502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.195514 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.195533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.195546 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: E1126 12:12:44.204668 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:44Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.208178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.208208 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.208216 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.208227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.208236 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: E1126 12:12:44.215918 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:44Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.218020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.218050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.218060 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.218072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.218080 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: E1126 12:12:44.227431 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:44Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.229807 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.229836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.229847 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.229859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.229867 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: E1126 12:12:44.239466 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:44Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.241543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.241575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.241587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.241598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.241608 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: E1126 12:12:44.248915 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:44Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:44 crc kubenswrapper[4834]: E1126 12:12:44.249028 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.249972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.249993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.250003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.250016 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.250026 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.352097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.352164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.352177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.352200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.352212 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.416221 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.416251 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.416271 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:44 crc kubenswrapper[4834]: E1126 12:12:44.416386 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:44 crc kubenswrapper[4834]: E1126 12:12:44.416499 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:44 crc kubenswrapper[4834]: E1126 12:12:44.416601 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.454273 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.454322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.454333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.454352 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.454366 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.559535 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.559571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.559582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.559597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.559611 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.661470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.661497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.661505 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.661519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.661529 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.763456 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.763516 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.763526 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.763544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.763556 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.865361 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.865393 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.865405 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.865416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.865428 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.967719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.967764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.967774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.967789 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:44 crc kubenswrapper[4834]: I1126 12:12:44.967802 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:44Z","lastTransitionTime":"2025-11-26T12:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.073651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.073683 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.073691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.073704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.073713 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.176641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.176692 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.176701 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.176716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.176726 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.279705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.279753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.279766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.279781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.279791 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.383101 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.383141 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.383153 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.383167 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.383178 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.417484 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:45 crc kubenswrapper[4834]: E1126 12:12:45.417609 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.484773 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.484821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.484833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.484849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.484863 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.586899 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.586934 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.586945 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.586961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.586970 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.689774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.689822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.689834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.689856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.689868 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.792216 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.792249 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.792258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.792276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.792287 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.894111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.894158 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.894169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.894187 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.894198 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.995972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.996014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.996026 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.996043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:45 crc kubenswrapper[4834]: I1126 12:12:45.996053 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:45Z","lastTransitionTime":"2025-11-26T12:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.098131 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.098178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.098189 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.098204 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.098218 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:46Z","lastTransitionTime":"2025-11-26T12:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.200558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.200613 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.200624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.200647 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.200661 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:46Z","lastTransitionTime":"2025-11-26T12:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.302664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.302707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.302717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.302731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.302741 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:46Z","lastTransitionTime":"2025-11-26T12:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.405100 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.405155 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.405164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.405181 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.405194 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:46Z","lastTransitionTime":"2025-11-26T12:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.416422 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.416479 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.416427 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:46 crc kubenswrapper[4834]: E1126 12:12:46.416544 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:46 crc kubenswrapper[4834]: E1126 12:12:46.416605 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:46 crc kubenswrapper[4834]: E1126 12:12:46.416677 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.507414 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.507458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.507468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.507485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.507499 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:46Z","lastTransitionTime":"2025-11-26T12:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.609357 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.609393 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.609402 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.609413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.609423 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:46Z","lastTransitionTime":"2025-11-26T12:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.711693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.711723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.711732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.711746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.711772 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:46Z","lastTransitionTime":"2025-11-26T12:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.814168 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.814204 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.814214 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.814230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.814241 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:46Z","lastTransitionTime":"2025-11-26T12:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.915969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.916027 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.916041 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.916063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:46 crc kubenswrapper[4834]: I1126 12:12:46.916101 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:46Z","lastTransitionTime":"2025-11-26T12:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.018411 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.018450 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.018463 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.018477 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.018488 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.120857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.120911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.120922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.120935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.120945 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.223208 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.223244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.223254 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.223289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.223303 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.325250 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.325303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.325331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.325348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.325359 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.416952 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:47 crc kubenswrapper[4834]: E1126 12:12:47.417203 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.427252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.427284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.427294 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.427321 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.427544 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.529715 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.529771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.529783 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.529805 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.529820 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.631812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.631857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.631869 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.631887 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.631899 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.734557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.734605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.734619 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.734637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.734649 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.836746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.836780 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.836791 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.836804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.836814 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.939017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.939074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.939087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.939105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:47 crc kubenswrapper[4834]: I1126 12:12:47.939116 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:47Z","lastTransitionTime":"2025-11-26T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.041359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.041392 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.041402 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.041419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.041431 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.143883 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.143939 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.143951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.143973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.143985 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.245966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.246001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.246011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.246053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.246066 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.348616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.348649 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.348658 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.348674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.348687 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.416527 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.416576 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:48 crc kubenswrapper[4834]: E1126 12:12:48.416685 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.416529 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:48 crc kubenswrapper[4834]: E1126 12:12:48.416854 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:48 crc kubenswrapper[4834]: E1126 12:12:48.417028 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.451118 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.451150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.451159 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.451172 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.451183 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.553125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.553156 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.553165 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.553179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.553189 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.655816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.655869 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.655879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.655897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.655912 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.757596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.757634 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.757645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.757658 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.757667 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.860106 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.860491 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.860567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.860664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.860725 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.963066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.963131 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.963144 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.963165 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:48 crc kubenswrapper[4834]: I1126 12:12:48.963191 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:48Z","lastTransitionTime":"2025-11-26T12:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.065093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.065137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.065152 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.065174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.065192 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.167701 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.167748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.167759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.167775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.167786 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.270103 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.270136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.270148 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.270162 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.270173 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.372580 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.372628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.372638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.372655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.372666 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.416819 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:49 crc kubenswrapper[4834]: E1126 12:12:49.417044 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.475829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.475882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.475893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.475913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.475925 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.578678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.578731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.578742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.578764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.578777 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.680770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.680812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.680824 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.680843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.680855 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.783420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.783477 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.783489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.783506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.783518 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.885772 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.885816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.885826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.885839 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.885853 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.988500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.988548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.988559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.988578 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:49 crc kubenswrapper[4834]: I1126 12:12:49.988589 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:49Z","lastTransitionTime":"2025-11-26T12:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.091190 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.091244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.091255 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.091273 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.091288 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:50Z","lastTransitionTime":"2025-11-26T12:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.096786 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:50 crc kubenswrapper[4834]: E1126 12:12:50.097080 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:50 crc kubenswrapper[4834]: E1126 12:12:50.097181 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs podName:6feada4f-ea0c-4062-ab87-ff88a4590c96 nodeName:}" failed. No retries permitted until 2025-11-26 12:13:22.097137584 +0000 UTC m=+100.004369220 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs") pod "network-metrics-daemon-tmlsw" (UID: "6feada4f-ea0c-4062-ab87-ff88a4590c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.193953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.193999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.194023 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.194040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.194054 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:50Z","lastTransitionTime":"2025-11-26T12:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.296094 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.296134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.296147 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.296166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.296178 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:50Z","lastTransitionTime":"2025-11-26T12:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.398243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.398288 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.398297 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.398328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.398339 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:50Z","lastTransitionTime":"2025-11-26T12:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.416079 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.416165 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.416209 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:50 crc kubenswrapper[4834]: E1126 12:12:50.416207 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:50 crc kubenswrapper[4834]: E1126 12:12:50.416337 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:50 crc kubenswrapper[4834]: E1126 12:12:50.416394 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.500212 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.500249 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.500257 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.500271 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.500280 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:50Z","lastTransitionTime":"2025-11-26T12:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.602033 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.602067 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.602078 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.602087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.602096 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:50Z","lastTransitionTime":"2025-11-26T12:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.703954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.703996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.704022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.704035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.704051 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:50Z","lastTransitionTime":"2025-11-26T12:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.805865 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.805917 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.805929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.805949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.805961 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:50Z","lastTransitionTime":"2025-11-26T12:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.907657 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.907686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.907694 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.907706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:50 crc kubenswrapper[4834]: I1126 12:12:50.907716 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:50Z","lastTransitionTime":"2025-11-26T12:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.009932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.009966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.009975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.009986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.009995 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.112422 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.112459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.112469 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.112482 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.112495 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.214141 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.214180 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.214192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.214206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.214218 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.316234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.316298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.316332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.316355 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.316367 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.417185 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:51 crc kubenswrapper[4834]: E1126 12:12:51.417361 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.419185 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.419279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.419333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.419353 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.419363 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.523339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.523377 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.523387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.523405 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.523417 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.625551 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.625724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.625848 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.625927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.626002 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.728835 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.728960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.729040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.729115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.729187 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.831480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.831525 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.831535 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.831552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.831563 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.933763 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.933881 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.933984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.934074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:51 crc kubenswrapper[4834]: I1126 12:12:51.934142 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:51Z","lastTransitionTime":"2025-11-26T12:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.038353 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.038410 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.038425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.038445 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.038460 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.141149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.141210 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.141220 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.141234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.141244 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.244039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.244079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.244091 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.244107 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.244119 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.347384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.347431 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.347441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.347455 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.347465 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.416962 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.417196 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:52 crc kubenswrapper[4834]: E1126 12:12:52.417296 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:52 crc kubenswrapper[4834]: E1126 12:12:52.417453 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.417917 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:52 crc kubenswrapper[4834]: E1126 12:12:52.418118 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.430626 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.441783 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.449173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc 
kubenswrapper[4834]: I1126 12:12:52.449209 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.449221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.449237 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.449246 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.454232 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.467212 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.478410 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.490848 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.499846 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.508279 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc 
kubenswrapper[4834]: I1126 12:12:52.517471 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.526765 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.535911 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.543092 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.552192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.552247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.552262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.552286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.552300 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.552895 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a85
7e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.562831 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.582197 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.591883 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.602379 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.619848 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.654660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.654712 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.654724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.654742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.654754 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.756087 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k8hjt_234b786b-76dd-4238-81bd-a743042bece9/kube-multus/0.log" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.756158 4834 generic.go:334] "Generic (PLEG): container finished" podID="234b786b-76dd-4238-81bd-a743042bece9" containerID="a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea" exitCode=1 Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.756203 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k8hjt" event={"ID":"234b786b-76dd-4238-81bd-a743042bece9","Type":"ContainerDied","Data":"a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.756710 4834 scope.go:117] "RemoveContainer" containerID="a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.757663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.757700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.757712 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.757730 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.757743 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.774974 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.785551 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.796913 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.812094 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.822038 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.832921 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.844749 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.855583 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.860197 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.860239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.860250 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 
12:12:52.860268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.860284 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.867143 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.877033 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.886849 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:51Z\\\",\\\"message\\\":\\\"2025-11-26T12:12:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f\\\\n2025-11-26T12:12:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f to /host/opt/cni/bin/\\\\n2025-11-26T12:12:06Z [verbose] multus-daemon started\\\\n2025-11-26T12:12:06Z [verbose] Readiness Indicator file check\\\\n2025-11-26T12:12:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.896146 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.904207 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.914057 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.921648 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.929344 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc 
kubenswrapper[4834]: I1126 12:12:52.937071 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.944394 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:52Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.962716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.962766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.962781 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.962800 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:52 crc kubenswrapper[4834]: I1126 12:12:52.962810 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:52Z","lastTransitionTime":"2025-11-26T12:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.065181 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.065248 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.065262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.065283 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.065298 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.167484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.167534 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.167544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.167561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.167572 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.269519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.269544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.269559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.269571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.269579 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.372836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.372888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.372903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.372923 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.372934 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.416735 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:53 crc kubenswrapper[4834]: E1126 12:12:53.416901 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.417631 4834 scope.go:117] "RemoveContainer" containerID="5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.475633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.475687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.475703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.475723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.475737 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.578802 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.578842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.578852 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.578884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.578897 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.681339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.681384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.681396 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.681413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.681425 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.760917 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/2.log" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.763413 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.763845 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.765145 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k8hjt_234b786b-76dd-4238-81bd-a743042bece9/kube-multus/0.log" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.765194 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k8hjt" event={"ID":"234b786b-76dd-4238-81bd-a743042bece9","Type":"ContainerStarted","Data":"516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.783102 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.783146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.783157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.783170 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc 
kubenswrapper[4834]: I1126 12:12:53.783180 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.783619 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.791789 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.805302 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.820351 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports 
Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.830065 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.839675 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.848136 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.857033 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.865963 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.879644 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:51Z\\\",\\\"message\\\":\\\"2025-11-26T12:12:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f\\\\n2025-11-26T12:12:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f to /host/opt/cni/bin/\\\\n2025-11-26T12:12:06Z [verbose] multus-daemon started\\\\n2025-11-26T12:12:06Z [verbose] Readiness Indicator file check\\\\n2025-11-26T12:12:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.885420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.885449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.885458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.885473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.885482 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.892015 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.906252 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.917760 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.926342 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.934618 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.944246 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc 
kubenswrapper[4834]: I1126 12:12:53.955917 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.964083 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.973056 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.982017 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.987257 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.987291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.987301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.987331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.987343 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:53Z","lastTransitionTime":"2025-11-26T12:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.990191 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:53 crc kubenswrapper[4834]: I1126 12:12:53.998975 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:51Z\\\",\\\"message\\\":\\\"2025-11-26T12:12:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f\\\\n2025-11-26T12:12:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f to /host/opt/cni/bin/\\\\n2025-11-26T12:12:06Z [verbose] multus-daemon started\\\\n2025-11-26T12:12:06Z [verbose] 
Readiness Indicator file check\\\\n2025-11-26T12:12:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:53Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.007811 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.017786 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.026951 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.035917 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.045554 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.052242 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.059118 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc 
kubenswrapper[4834]: I1126 12:12:54.068651 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.077729 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.085598 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.089677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.089716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.089729 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.089746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.089757 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.101277 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf59
3b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.117746 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports 
Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.131796 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.139923 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.191907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.191953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.191966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.191987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.192002 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.294257 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.294303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.294333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.294354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.294368 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.396008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.396043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.396055 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.396072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.396084 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.416737 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.416745 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.416849 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.416895 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.417046 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.417124 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.485300 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.485354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.485365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.485378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.485388 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.496160 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.499325 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.499359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.499371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.499383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.499392 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.512796 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.516457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.516493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.516504 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.516520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.516529 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.525337 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.528493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.528531 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.528541 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.528561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.528573 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.538710 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.541661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.541695 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.541705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.541721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.541730 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.552692 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.552831 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.554507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.554544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.554561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.554576 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.554589 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.657536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.657590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.657601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.657620 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.657637 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.759837 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.759888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.759901 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.759926 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.759938 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.769146 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/3.log" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.769653 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/2.log" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.771881 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" exitCode=1 Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.771929 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.771968 4834 scope.go:117] "RemoveContainer" containerID="5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.772832 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:12:54 crc kubenswrapper[4834]: E1126 12:12:54.773247 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.785278 4834 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.795142 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.803212 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.811787 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc 
kubenswrapper[4834]: I1126 12:12:54.823475 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.834172 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.842793 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.858049 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.861857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.861897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.861911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.861929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.861942 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.871975 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5943bf63d72373887aaa90f915c5d45035adeabafb88419ae0e7c1904788dfea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:28Z\\\",\\\"message\\\":\\\"43] Built service openshift-machine-api/machine-api-controllers LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8441, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8442, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.167\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8444, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1126 12:12:28.146046 6492 services_controller.go:444] Built service openshift-machine-api/machine-api-controllers LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1126 12:12:28.146042 6492 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 12:12:54.152380 6843 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 12:12:54.151174 6843 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-4rwmt\\\\nF1126 12:12:54.152353 6843 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8
859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.886466 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.894772 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.904258 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.914484 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.922858 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.932924 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:51Z\\\",\\\"message\\\":\\\"2025-11-26T12:12:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f\\\\n2025-11-26T12:12:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f to /host/opt/cni/bin/\\\\n2025-11-26T12:12:06Z [verbose] multus-daemon started\\\\n2025-11-26T12:12:06Z [verbose] Readiness Indicator file check\\\\n2025-11-26T12:12:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.942805 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.952401 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.961551 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:54Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.964416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.964452 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.964463 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.964480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:54 crc kubenswrapper[4834]: I1126 12:12:54.964492 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:54Z","lastTransitionTime":"2025-11-26T12:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.067151 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.067206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.067232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.067260 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.067273 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.170184 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.170221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.170244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.170258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.170268 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.273163 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.273226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.273261 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.273293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.273327 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.375734 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.375777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.375787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.375808 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.375820 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.416453 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:55 crc kubenswrapper[4834]: E1126 12:12:55.416710 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.427500 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.477706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.477754 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.477767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.477783 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.477797 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.580877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.580909 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.580918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.580930 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.580942 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.683132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.683183 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.683195 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.683218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.683232 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.776127 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/3.log" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.780427 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:12:55 crc kubenswrapper[4834]: E1126 12:12:55.780653 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.784857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.784896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.784908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.784927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.784942 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.800691 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"re
ason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.811107 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.821968 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.836472 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 12:12:54.152380 6843 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 12:12:54.151174 6843 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-4rwmt\\\\nF1126 12:12:54.152353 6843 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.844899 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.855046 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:51Z\\\",\\\"message\\\":\\\"2025-11-26T12:12:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f\\\\n2025-11-26T12:12:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f to /host/opt/cni/bin/\\\\n2025-11-26T12:12:06Z [verbose] multus-daemon started\\\\n2025-11-26T12:12:06Z [verbose] Readiness Indicator file check\\\\n2025-11-26T12:12:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.863723 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.873502 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.887925 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.888040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.888145 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.888258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.888639 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.890107 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e8
7e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.900362 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.911177 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.920852 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.929773 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc 
kubenswrapper[4834]: I1126 12:12:55.937603 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc183bea-c53d-4870-8a69-4b812b45aa4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66866c33a678a8db1b8f2aad639dfb35a4db4c640e07262f340fb0d35025b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://99d6f06b314de361e81e7b6be89722a06fe4a2c0c3418d3a3b0c8254babe1baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99d6f06b314de361e81e7b6be89722a06fe4a2c0c3418d3a3b0c8254babe1baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.947999 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.957585 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.966432 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.975339 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.984255 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:12:55Z is after 2025-08-24T17:21:41Z" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.990790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.990892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.990956 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.991024 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:55 crc kubenswrapper[4834]: I1126 12:12:55.991092 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:55Z","lastTransitionTime":"2025-11-26T12:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.093538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.093664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.093746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.093815 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.093884 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:56Z","lastTransitionTime":"2025-11-26T12:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.195670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.195843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.195926 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.195991 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.196056 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:56Z","lastTransitionTime":"2025-11-26T12:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.298179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.298285 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.298401 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.298468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.298530 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:56Z","lastTransitionTime":"2025-11-26T12:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.400953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.401077 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.401092 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.401107 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.401117 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:56Z","lastTransitionTime":"2025-11-26T12:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.417180 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.417199 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.417372 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:56 crc kubenswrapper[4834]: E1126 12:12:56.417581 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:56 crc kubenswrapper[4834]: E1126 12:12:56.417684 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:56 crc kubenswrapper[4834]: E1126 12:12:56.417812 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.504024 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.504083 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.504101 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.504121 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.504136 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:56Z","lastTransitionTime":"2025-11-26T12:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.605843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.605890 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.605900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.605918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.605930 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:56Z","lastTransitionTime":"2025-11-26T12:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.708474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.708954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.709041 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.709127 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.709199 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:56Z","lastTransitionTime":"2025-11-26T12:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.811362 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.811394 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.811404 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.811418 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.811428 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:56Z","lastTransitionTime":"2025-11-26T12:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.912834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.912857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.912867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.912883 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:56 crc kubenswrapper[4834]: I1126 12:12:56.912895 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:56Z","lastTransitionTime":"2025-11-26T12:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.015108 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.015133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.015141 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.015150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.015157 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.116896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.116926 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.116936 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.116947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.116955 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.218582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.218627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.218637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.218650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.218659 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.320907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.321272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.321369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.321454 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.321515 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.416893 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:57 crc kubenswrapper[4834]: E1126 12:12:57.417014 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.423291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.423406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.423473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.423538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.423603 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.526160 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.526252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.526364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.526449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.526512 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.628139 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.628169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.628178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.628189 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.628197 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.730341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.730382 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.730392 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.730406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.730428 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.831885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.831933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.831944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.831962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.831972 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.934920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.934972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.934983 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.935004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:57 crc kubenswrapper[4834]: I1126 12:12:57.935017 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:57Z","lastTransitionTime":"2025-11-26T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.036998 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.037153 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.037219 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.037286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.037371 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.139260 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.139343 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.139361 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.139383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.139397 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.242122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.242165 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.242176 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.242193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.242206 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.344037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.344099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.344110 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.344124 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.344133 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.416434 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.416488 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:12:58 crc kubenswrapper[4834]: E1126 12:12:58.416531 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:12:58 crc kubenswrapper[4834]: E1126 12:12:58.416583 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.416727 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:12:58 crc kubenswrapper[4834]: E1126 12:12:58.416786 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.445698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.445730 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.445738 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.445749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.445760 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.548083 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.548138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.548149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.548161 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.548171 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.650208 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.650248 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.650258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.650272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.650282 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.752003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.752038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.752047 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.752059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.752067 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.853472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.853521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.853530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.853544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.853554 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.955571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.955610 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.955619 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.955651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:58 crc kubenswrapper[4834]: I1126 12:12:58.955661 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:58Z","lastTransitionTime":"2025-11-26T12:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.058081 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.058122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.058132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.058147 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.058158 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.160421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.160456 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.160467 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.160479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.160486 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.263146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.263218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.263232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.263259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.263275 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.365054 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.365098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.365108 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.365126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.365136 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.416829 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:12:59 crc kubenswrapper[4834]: E1126 12:12:59.417065 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.467202 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.467248 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.467258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.467271 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.467281 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.569495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.569558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.569571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.569605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.569615 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.670866 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.670896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.670905 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.670916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.670929 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.772947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.772984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.772996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.773009 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.773019 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.874877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.874912 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.874922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.874950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.874959 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.976021 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.976049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.976059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.976070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:12:59 crc kubenswrapper[4834]: I1126 12:12:59.976079 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:12:59Z","lastTransitionTime":"2025-11-26T12:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.077630 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.077685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.077696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.077709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.077717 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.179584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.179678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.179691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.179714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.179726 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.281837 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.281886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.281897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.281910 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.281920 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.384896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.384928 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.384948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.384961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.384971 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.416391 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.416443 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:00 crc kubenswrapper[4834]: E1126 12:13:00.416515 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.416541 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:00 crc kubenswrapper[4834]: E1126 12:13:00.416627 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:00 crc kubenswrapper[4834]: E1126 12:13:00.416701 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.487113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.487139 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.487147 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.487158 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.487167 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.589575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.589631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.589643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.589654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.589663 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.691834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.691864 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.691873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.691888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.691898 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.793601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.793632 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.793650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.793662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.793670 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.895397 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.895447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.895458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.895478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.895487 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.998427 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.998493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.998506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.998531 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:00 crc kubenswrapper[4834]: I1126 12:13:00.998542 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:00Z","lastTransitionTime":"2025-11-26T12:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.101340 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.101406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.101417 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.101442 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.101456 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:01Z","lastTransitionTime":"2025-11-26T12:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.203400 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.203459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.203475 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.203493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.203507 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:01Z","lastTransitionTime":"2025-11-26T12:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.306008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.306056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.306068 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.306085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.306097 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:01Z","lastTransitionTime":"2025-11-26T12:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.408462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.408495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.408503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.408515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.408523 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:01Z","lastTransitionTime":"2025-11-26T12:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.416991 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:01 crc kubenswrapper[4834]: E1126 12:13:01.417091 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.510627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.510704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.510721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.510746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.510764 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:01Z","lastTransitionTime":"2025-11-26T12:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.613297 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.613371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.613439 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.613458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.613471 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:01Z","lastTransitionTime":"2025-11-26T12:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.715789 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.715861 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.716067 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.716085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.716095 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:01Z","lastTransitionTime":"2025-11-26T12:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.818989 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.819030 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.819063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.819080 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.819094 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:01Z","lastTransitionTime":"2025-11-26T12:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.922019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.922076 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.922087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.922108 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:01 crc kubenswrapper[4834]: I1126 12:13:01.922336 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:01Z","lastTransitionTime":"2025-11-26T12:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.025682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.025724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.025743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.025756 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.025769 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.128951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.128998 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.129009 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.129026 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.129038 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.231989 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.232049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.232061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.232083 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.232097 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.334414 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.334465 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.334476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.334492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.334502 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.416908 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.416949 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:02 crc kubenswrapper[4834]: E1126 12:13:02.417021 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.417032 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:02 crc kubenswrapper[4834]: E1126 12:13:02.418063 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:02 crc kubenswrapper[4834]: E1126 12:13:02.418170 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.428702 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.436974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.437012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.437022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.437037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.437046 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.437283 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.450211 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19b
e28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.458164 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.470496 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.489323 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 12:12:54.152380 6843 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 12:12:54.151174 6843 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-4rwmt\\\\nF1126 12:12:54.152353 6843 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.500901 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.510635 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.520989 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.531057 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.539130 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.539168 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.539178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 
12:13:02.539193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.539203 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.541327 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.548520 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.557428 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee6
5f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:51Z\\\",\\\"message\\\":\\\"2025-11-26T12:12:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f\\\\n2025-11-26T12:12:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f to /host/opt/cni/bin/\\\\n2025-11-26T12:12:06Z [verbose] multus-daemon started\\\\n2025-11-26T12:12:06Z [verbose] Readiness Indicator file check\\\\n2025-11-26T12:12:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.564363 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc183bea-c53d-4870-8a69-4b812b45aa4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66866c33a678a8db1b8f2aad639dfb35a4db4c640e07262f340fb0d35025b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242
b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d6f06b314de361e81e7b6be89722a06fe4a2c0c3418d3a3b0c8254babe1baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99d6f06b314de361e81e7b6be89722a06fe4a2c0c3418d3a3b0c8254babe1baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 
2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.572961 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.581298 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.590566 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.597518 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.604643 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:02Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:02 crc 
kubenswrapper[4834]: I1126 12:13:02.641218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.641253 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.641262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.641275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.641284 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.742512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.742552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.742561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.742575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.742584 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.844284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.844339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.844349 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.844362 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.844373 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.946253 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.946281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.946290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.946302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:02 crc kubenswrapper[4834]: I1126 12:13:02.946335 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:02Z","lastTransitionTime":"2025-11-26T12:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.048862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.048920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.048929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.048942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.048950 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.152999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.153043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.153052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.153065 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.153074 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.254635 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.254671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.254681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.254696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.254706 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.356719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.356746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.356756 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.356769 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.356778 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.416544 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:03 crc kubenswrapper[4834]: E1126 12:13:03.416673 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.459664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.459717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.459728 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.459745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.459756 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.562395 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.562433 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.562443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.562457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.562468 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.664778 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.664833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.664843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.664877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.664888 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.766656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.766698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.766711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.766730 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.766742 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.868882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.868941 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.868952 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.868967 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.868981 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.970239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.970272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.970282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.970298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:03 crc kubenswrapper[4834]: I1126 12:13:03.970328 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:03Z","lastTransitionTime":"2025-11-26T12:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.072624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.072659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.072669 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.072682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.072692 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.174634 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.174676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.174686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.174704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.174718 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.277234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.277291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.277301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.277354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.277370 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.326453 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.326644 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 12:14:08.326609318 +0000 UTC m=+146.233822680 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.380226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.380270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.380283 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.380298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.380344 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.416243 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.417434 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.417451 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.417583 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.417636 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.417706 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.427120 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.427179 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.427211 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.427252 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427270 4834 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427346 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:14:08.427325812 +0000 UTC m=+146.334539164 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427382 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427438 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427443 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427458 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427592 4834 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 12:14:08.427563938 +0000 UTC m=+146.334777291 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427381 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427626 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427654 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427627 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 12:14:08.42761625 +0000 UTC m=+146.334829612 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.427698 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 12:14:08.427684334 +0000 UTC m=+146.334897686 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.482604 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.482642 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.482653 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.482666 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.482677 4834 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.585656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.585689 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.585700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.585715 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.585729 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.688492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.688540 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.688551 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.688572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.688584 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.790907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.790956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.790966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.790987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.790999 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.873586 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.873646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.873656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.873683 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.873696 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.886202 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.890001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.890034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.890046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.890062 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.890075 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.899574 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.902639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.902678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.902687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.902700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.902710 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.912298 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.921351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.921398 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.921408 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.921420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.921428 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.929851 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.932879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.932907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.932932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.932950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.932959 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.941952 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:04Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:04 crc kubenswrapper[4834]: E1126 12:13:04.942056 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.943010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.943086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.943095 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.943105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:04 crc kubenswrapper[4834]: I1126 12:13:04.943114 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:04Z","lastTransitionTime":"2025-11-26T12:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.045470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.045519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.045530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.045545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.045555 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.147797 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.147859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.147871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.147896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.147914 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.249598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.249629 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.249637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.249652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.249661 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.352259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.352287 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.352298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.352331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.352341 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.416653 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:05 crc kubenswrapper[4834]: E1126 12:13:05.416755 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.454539 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.454575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.454585 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.454603 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.454614 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.557519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.557560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.557569 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.557584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.557595 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.659765 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.659820 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.659829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.659841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.659850 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.762012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.762040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.762049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.762062 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.762070 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.864072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.864099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.864108 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.864119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.864128 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.966529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.966579 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.966590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.966609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:05 crc kubenswrapper[4834]: I1126 12:13:05.966619 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:05Z","lastTransitionTime":"2025-11-26T12:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.068732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.068770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.068781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.068800 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.068813 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.170453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.170484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.170494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.170507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.170515 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.272436 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.272478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.272489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.272502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.272512 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.373856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.373886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.373900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.373913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.373922 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.416766 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.416825 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:06 crc kubenswrapper[4834]: E1126 12:13:06.416896 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.416779 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:06 crc kubenswrapper[4834]: E1126 12:13:06.417012 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:06 crc kubenswrapper[4834]: E1126 12:13:06.417098 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.476168 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.476200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.476210 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.476224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.476235 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.578228 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.578258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.578269 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.578280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.578290 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.680145 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.680174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.680184 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.680194 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.680204 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.781613 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.781861 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.781944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.782011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.782084 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.884500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.884538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.884549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.884561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.884575 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.986898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.986930 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.986941 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.986953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:06 crc kubenswrapper[4834]: I1126 12:13:06.986962 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:06Z","lastTransitionTime":"2025-11-26T12:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.088806 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.088853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.088863 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.088880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.088894 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:07Z","lastTransitionTime":"2025-11-26T12:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.190542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.190569 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.190578 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.190590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.190599 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:07Z","lastTransitionTime":"2025-11-26T12:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.292715 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.292759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.292770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.292784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.292794 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:07Z","lastTransitionTime":"2025-11-26T12:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.394364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.394409 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.394419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.394440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.394458 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:07Z","lastTransitionTime":"2025-11-26T12:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.416724 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:07 crc kubenswrapper[4834]: E1126 12:13:07.416840 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.496537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.496601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.496614 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.496629 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.496644 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:07Z","lastTransitionTime":"2025-11-26T12:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.598863 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.599115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.599198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.599262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.599346 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:07Z","lastTransitionTime":"2025-11-26T12:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.701328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.701374 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.701384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.701401 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.701412 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:07Z","lastTransitionTime":"2025-11-26T12:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.803472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.803502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.803511 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.803524 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.803531 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:07Z","lastTransitionTime":"2025-11-26T12:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.905916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.905976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.906010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.906035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:07 crc kubenswrapper[4834]: I1126 12:13:07.906051 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:07Z","lastTransitionTime":"2025-11-26T12:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.008444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.008484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.008493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.008506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.008516 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.111156 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.111213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.111227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.111247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.111260 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.214070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.214112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.214125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.214145 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.214156 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.316327 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.316371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.316384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.316400 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.316411 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.416264 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.416302 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.416391 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:08 crc kubenswrapper[4834]: E1126 12:13:08.416431 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:08 crc kubenswrapper[4834]: E1126 12:13:08.416528 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:08 crc kubenswrapper[4834]: E1126 12:13:08.416702 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.417935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.417978 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.417993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.418010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.418024 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.520734 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.520775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.520786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.520802 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.520816 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.622794 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.622842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.622852 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.622868 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.622879 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.725286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.725335 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.725346 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.725357 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.725364 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.827215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.827326 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.827340 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.827359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.827370 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.929595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.929637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.929648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.929662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:08 crc kubenswrapper[4834]: I1126 12:13:08.929674 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:08Z","lastTransitionTime":"2025-11-26T12:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.031860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.031894 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.031904 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.031916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.031924 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.133947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.134006 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.134016 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.134038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.134055 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.235959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.235996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.236006 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.236020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.236031 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.338207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.338246 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.338305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.338353 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.338373 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.416402 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:09 crc kubenswrapper[4834]: E1126 12:13:09.416561 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.446115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.446151 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.446162 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.446175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.446185 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.548150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.548185 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.548195 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.548207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.548218 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.650968 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.651007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.651018 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.651029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.651039 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.753176 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.753224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.753236 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.753259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.753274 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.855192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.855229 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.855239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.855252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.855263 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.957257 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.957293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.957303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.957344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:09 crc kubenswrapper[4834]: I1126 12:13:09.957355 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:09Z","lastTransitionTime":"2025-11-26T12:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.059480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.059510 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.059520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.059537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.059548 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.161637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.161671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.161681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.161696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.161706 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.264030 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.264074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.264084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.264097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.264106 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.365871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.365896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.365905 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.365916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.365924 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.416850 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.417066 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:10 crc kubenswrapper[4834]: E1126 12:13:10.417148 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.417193 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:10 crc kubenswrapper[4834]: E1126 12:13:10.417290 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:10 crc kubenswrapper[4834]: E1126 12:13:10.417652 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.467852 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.467886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.467895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.467905 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.467913 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.569725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.569754 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.569764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.569778 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.569790 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.672302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.672387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.672399 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.672424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.672440 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.774748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.774784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.774794 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.774808 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.774818 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.876419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.876452 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.876460 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.876473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.876483 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.978146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.978180 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.978189 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.978200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:10 crc kubenswrapper[4834]: I1126 12:13:10.978207 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:10Z","lastTransitionTime":"2025-11-26T12:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.080235 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.080292 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.080303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.080333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.080343 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:11Z","lastTransitionTime":"2025-11-26T12:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.182345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.182411 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.182424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.182436 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.182444 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:11Z","lastTransitionTime":"2025-11-26T12:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.284446 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.284500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.284511 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.284523 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.284538 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:11Z","lastTransitionTime":"2025-11-26T12:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.387154 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.387214 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.387226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.387249 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.387264 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:11Z","lastTransitionTime":"2025-11-26T12:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.416678 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:11 crc kubenswrapper[4834]: E1126 12:13:11.417077 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.417233 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:13:11 crc kubenswrapper[4834]: E1126 12:13:11.417401 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.489003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.489056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.489069 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.489089 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.489104 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:11Z","lastTransitionTime":"2025-11-26T12:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.592367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.592457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.592468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.592482 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.592491 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:11Z","lastTransitionTime":"2025-11-26T12:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.694774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.694827 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.694836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.694852 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.694863 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:11Z","lastTransitionTime":"2025-11-26T12:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.796803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.796846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.796859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.796872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.796885 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:11Z","lastTransitionTime":"2025-11-26T12:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.898811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.898867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.898879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.898902 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:11 crc kubenswrapper[4834]: I1126 12:13:11.898916 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:11Z","lastTransitionTime":"2025-11-26T12:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.000472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.000511 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.000521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.000543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.000555 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.102740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.102767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.102776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.102787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.102796 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.204880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.204911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.204921 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.204930 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.204938 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.307019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.307073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.307086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.307109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.307125 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.408344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.408373 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.408381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.408391 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.408400 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.416796 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.416830 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:12 crc kubenswrapper[4834]: E1126 12:13:12.416929 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.416964 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:12 crc kubenswrapper[4834]: E1126 12:13:12.417086 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:12 crc kubenswrapper[4834]: E1126 12:13:12.417142 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.427147 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.436595 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.443421 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5gvrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a5b7690-00d4-4ca0-8d22-b236f5d25580\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ffd72d5183fc75865b11c9a5f8aeb25d372325058274c213c29d3bcceb7569e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc2sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5gvrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.450830 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6feada4f-ea0c-4062-ab87-ff88a4590c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvfrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tmlsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc 
kubenswrapper[4834]: I1126 12:13:12.459480 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc183bea-c53d-4870-8a69-4b812b45aa4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66866c33a678a8db1b8f2aad639dfb35a4db4c640e07262f340fb0d35025b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://99d6f06b314de361e81e7b6be89722a06fe4a2c0c3418d3a3b0c8254babe1baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99d6f06b314de361e81e7b6be89722a06fe4a2c0c3418d3a3b0c8254babe1baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.468914 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1ffcfb4cbf5e423e3cfe500f0045b56e656ed99e81348151a0df9578f672504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.477692 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5b1dcba-68fa-4bea-be2d-de349512510b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6efd9359e976d04536ae799a834912315cc83c84c3dffb772cc04e5e46c9402b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74311ad050995cbb98a2e9be027a857e58c59a113bc5446386b3d0f668b503cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d71d8f32418cc1d676a30e9cc6a8a5d5aeacb0c90acfcac5e65227d0d13edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18ad3b143654842bb7262ca09736d1fcf0c76ec51b57d839a49154f8daea0427\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.487222 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d6a40a1-4fc5-447c-9f19-ce50904ebaaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e7c3fe34b1f864b5484bec9c243afded64698a5afb557ef83ae2109f25d9e44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ea85335817577413bb249f8be48e83ae2ab
77714e8365d20aacaf448050ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk8qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-br5pb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.498856 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7910fe0-c205-465a-b8b5-9b56d8bb1941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546a9462e79a27ef94f2282fe07b7b111181c919acb920460a2f5adcf593b5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ca45c91a9e3884a17a39b4cee31279da440a27ae665e50ba13a612ca200ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76dcacca9bcfc4e2c27e8a595951a3432518500d25df5cceff1389245b5b517f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d165d63a382a7e42775e7b0f4740746ce6062a842c7180f1bc3d14bd565fdf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1930
d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1930d60b324d38e98c40728b3ee7d949a155cb869f48b3c7780daf1f6d6e626\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a014e3d4196fe0633b35cf92c938f05cc491f0eccfed053c089e0c115561b6f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d94f149ccb111c616f0d52b0ad13d45c163f53d016849c8ad63132bec879d7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwrbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q5s5d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.509960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.509985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.509995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.510010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.510021 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.511014 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7f44620-97b4-4cdb-8252-d8a2971830fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:54Z\\\",\\\"message\\\":\\\"_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 12:12:54.152380 6843 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1126 12:12:54.151174 6843 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-4rwmt\\\\nF1126 12:12:54.152353 6843 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feba6e25271e491155
72b7eb83337893c4912d705cf2ff39909d1bac22ad8859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plkpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9dvt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.524897 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a216eece-eb40-4ec8-87e4-6ea72a35c64e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0833bf0a2df22482211b09be6affcd0f5ba8cf9e55ff1b9b05b24ca651c17f6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f26b117525f1cc334d25bebfb15a04d13f1255ae7545ffc46382878c768b1093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a36fa5527ec91dfb607d82e7229356881665cf82dde316ade0f3f5677d747c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f52a7066678fe6d858682ffccf7963189214e04e8a357cdb2af6438b7443bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91623541dadc2fe3a44bc001dd069e36001157163fa522926bee0ae45b9fcbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9235ecc280d32f52a5402588c5816ff7d4846a683e4478dfc1640de6eba6fe3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b5fe0ea17b6c78c45688b7042cb7b5109543e3844ca59e1c935386e732817b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c887f6c6de057eacbea26d583381bc19be28684fe0f924c41abbb757f85f4c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.534867 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb17aaf2f1b58cde03569166cf732c543997a10c792040cb29f58806a69915d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.544215 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.554585 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0b6bd0d8e7e712ea36eed19ef215f3a4a2ccb7da620597ee0294eaf00920be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a76d123a9e58997f6d88c68eb6654f494952ca5bfbc9a2d1d9981ea56f685c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.563241 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rwmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7c651b3-b6f5-4af8-9cc7-4728f137227a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc2bd9f584d1bf89e60fcb44d7075621d16ee400b5cf7d586c519cb3557eef7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx8mg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rwmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.572769 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-k8hjt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"234b786b-76dd-4238-81bd-a743042bece9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T12:12:51Z\\\",\\\"message\\\":\\\"2025-11-26T12:12:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f\\\\n2025-11-26T12:12:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_01869cfa-14e2-4d7e-aae2-211786f7bd4f to /host/opt/cni/bin/\\\\n2025-11-26T12:12:06Z [verbose] multus-daemon started\\\\n2025-11-26T12:12:06Z [verbose] Readiness Indicator file check\\\\n2025-11-26T12:12:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:12:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vlxwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-k8hjt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.581107 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15e8745-fc1a-4575-ac07-e483f8e41c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f44bc5907ee0e2ad9bba83572cb99131936fdccec662fee15c0715a16de5b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a06
3b76779cd951a830559aa6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6v8l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:12:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzb52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.591008 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3e686b-b5f9-4829-b768-68d16850643e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T12:11:59Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 12:11:54.383565 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 12:11:54.385079 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2279213654/tls.crt::/tmp/serving-cert-2279213654/tls.key\\\\\\\"\\\\nI1126 12:11:59.813288 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 12:11:59.815138 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 12:11:59.815154 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 12:11:59.815177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 12:11:59.815182 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 12:11:59.819386 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1126 12:11:59.819408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1126 12:11:59.819410 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 12:11:59.819431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 12:11:59.819434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 12:11:59.819436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 12:11:59.819439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1126 12:11:59.821036 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60e509254b1c8ccd2abface621e71ad1
525e94776c8c6c00314dc28998b997f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T12:11:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.600236 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a39cd7-9740-4af1-9dba-91f829a7df6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T12:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a107229fbc4ff5f3dfbd1586d39f13bf83e08a42aad411ae06c289f70df853c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5c32d267613dff95f0d2b69811da3d552a2fbc6b87efcea1a3c60c50f40e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aadbfd9a890e88817d9960bc50338c5452c6fc52c8ea462a2babe399aa57725c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T12:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T12:11:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:12Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.612366 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.612399 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.612412 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.612428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.612437 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.715064 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.715118 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.715132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.715151 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.715167 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.817206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.817412 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.817485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.817585 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.817650 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.919648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.919764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.919834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.919907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:12 crc kubenswrapper[4834]: I1126 12:13:12.919968 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:12Z","lastTransitionTime":"2025-11-26T12:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.021729 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.021774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.021785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.021802 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.021815 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.123365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.123393 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.123402 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.123411 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.123420 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.225372 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.225398 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.225408 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.225423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.225435 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.327818 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.327840 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.327848 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.327858 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.327865 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.416050 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:13 crc kubenswrapper[4834]: E1126 12:13:13.416157 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.429974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.430084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.430150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.430227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.430284 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.531969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.532000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.532011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.532023 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.532033 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.633951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.633973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.633983 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.633995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.634003 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.736921 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.736951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.736961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.736976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.736986 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.838766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.838805 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.838816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.838830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.838841 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.941215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.941256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.941265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.941281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:13 crc kubenswrapper[4834]: I1126 12:13:13.941292 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:13Z","lastTransitionTime":"2025-11-26T12:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.043386 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.043425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.043437 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.043454 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.043463 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.145435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.145480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.145491 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.145509 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.145530 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.247661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.247723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.247734 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.247755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.247767 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.350035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.350061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.350069 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.350085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.350097 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.416235 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.416248 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.416470 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:14 crc kubenswrapper[4834]: E1126 12:13:14.416395 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:14 crc kubenswrapper[4834]: E1126 12:13:14.416518 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:14 crc kubenswrapper[4834]: E1126 12:13:14.416606 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.452625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.452662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.452672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.452685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.452695 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.555146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.555177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.555211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.555225 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.555235 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.658088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.658133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.658144 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.658159 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.658169 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.760501 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.760539 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.760549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.760561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.760572 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.862964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.863009 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.863022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.863037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.863048 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.965507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.965554 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.965565 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.965585 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:14 crc kubenswrapper[4834]: I1126 12:13:14.965597 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:14Z","lastTransitionTime":"2025-11-26T12:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.067537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.067582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.067592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.067610 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.067624 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.169987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.170018 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.170029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.170042 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.170054 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.272285 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.272333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.272347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.272360 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.272370 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.328810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.328845 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.328855 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.328868 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.328877 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: E1126 12:13:15.339469 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.343416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.343457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.343470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.343486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.343496 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: E1126 12:13:15.354149 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.357635 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.357686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.357711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.357737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.357755 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: E1126 12:13:15.368227 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.370614 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.370643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.370651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.370661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.370670 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: E1126 12:13:15.379027 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.381668 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.381742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.381752 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.381767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.381781 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: E1126 12:13:15.389913 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T12:13:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87769375-2601-4b87-b79e-f69016761287\\\",\\\"systemUUID\\\":\\\"a6b07d33-3cf7-4bfd-b095-28713e624c71\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T12:13:15Z is after 2025-08-24T17:21:41Z" Nov 26 12:13:15 crc kubenswrapper[4834]: E1126 12:13:15.390039 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.391157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.391201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.391213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.391232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.391247 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.416414 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:15 crc kubenswrapper[4834]: E1126 12:13:15.416558 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.493227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.493265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.493278 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.493290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.493301 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.595098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.595133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.595143 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.595162 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.595176 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.696874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.696903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.696912 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.696925 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.696935 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.799291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.799346 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.799356 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.799373 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.799383 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.901422 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.901551 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.901616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.901689 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:15 crc kubenswrapper[4834]: I1126 12:13:15.901751 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:15Z","lastTransitionTime":"2025-11-26T12:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.003340 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.003375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.003386 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.003423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.003435 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.105242 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.105284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.105295 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.105329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.105341 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.207804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.207838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.207848 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.207861 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.207871 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.310286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.310489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.310571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.310647 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.310738 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.413130 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.413169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.413178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.413193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.413203 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.416451 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.416474 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:16 crc kubenswrapper[4834]: E1126 12:13:16.416570 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.416572 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:16 crc kubenswrapper[4834]: E1126 12:13:16.416665 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:16 crc kubenswrapper[4834]: E1126 12:13:16.416758 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.515753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.515903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.515979 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.516039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.516098 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.618553 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.618685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.618747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.618834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.618892 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.720084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.720105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.720112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.720122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.720129 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.822123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.822175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.822187 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.822210 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.822226 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.924358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.924404 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.924413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.924431 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:16 crc kubenswrapper[4834]: I1126 12:13:16.924443 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:16Z","lastTransitionTime":"2025-11-26T12:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.026751 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.026884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.026946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.027015 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.027068 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.128920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.128952 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.128961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.128975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.128984 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.230919 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.230950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.230969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.230982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.230998 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.332440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.332478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.332492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.332509 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.332519 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.416273 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:17 crc kubenswrapper[4834]: E1126 12:13:17.416440 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.434657 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.434749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.434827 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.434898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.434960 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.536447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.536551 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.536635 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.536719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.536772 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.638955 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.639034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.639044 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.639059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.639071 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.741447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.741515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.741529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.741545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.741556 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.843302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.843364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.843373 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.843388 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.843402 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.945414 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.945449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.945457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.945471 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:17 crc kubenswrapper[4834]: I1126 12:13:17.945481 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:17Z","lastTransitionTime":"2025-11-26T12:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.047395 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.047435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.047444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.047459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.047469 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.149379 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.149418 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.149429 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.149444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.149454 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.251745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.251782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.251814 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.251829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.251842 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.353976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.354008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.354018 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.354032 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.354045 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.417486 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.417525 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:18 crc kubenswrapper[4834]: E1126 12:13:18.417616 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.417487 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:18 crc kubenswrapper[4834]: E1126 12:13:18.417696 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:18 crc kubenswrapper[4834]: E1126 12:13:18.417766 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.455112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.455140 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.455150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.455164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.455175 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.557206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.557244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.557256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.557267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.557278 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.660720 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.660798 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.660811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.660841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.660852 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.762595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.762630 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.762639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.762653 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.762661 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.864753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.864798 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.864808 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.864823 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.864834 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.966287 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.966328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.966336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.966348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:18 crc kubenswrapper[4834]: I1126 12:13:18.966356 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:18Z","lastTransitionTime":"2025-11-26T12:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.069817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.069849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.069857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.069871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.069880 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.171942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.171971 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.171992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.172003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.172013 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.274088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.274126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.274136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.274150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.274160 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.376074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.376110 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.376119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.376132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.376140 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.416891 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:19 crc kubenswrapper[4834]: E1126 12:13:19.417212 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.478112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.478142 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.478149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.478159 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.478166 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.580381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.580404 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.580413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.580424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.580431 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.681953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.681983 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.681992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.682004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.682012 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.783681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.783719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.783726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.783741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.783751 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.885592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.885625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.885636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.885650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.885659 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.987099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.987140 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.987152 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.987165 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:19 crc kubenswrapper[4834]: I1126 12:13:19.987174 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:19Z","lastTransitionTime":"2025-11-26T12:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.088454 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.088483 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.088499 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.088510 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.088518 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:20Z","lastTransitionTime":"2025-11-26T12:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.190216 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.190239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.190248 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.190263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.190293 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:20Z","lastTransitionTime":"2025-11-26T12:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.292051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.292124 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.292135 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.292149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.292159 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:20Z","lastTransitionTime":"2025-11-26T12:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.394213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.394261 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.394272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.394283 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.394292 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:20Z","lastTransitionTime":"2025-11-26T12:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.416632 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.416652 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.416698 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:20 crc kubenswrapper[4834]: E1126 12:13:20.416754 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:20 crc kubenswrapper[4834]: E1126 12:13:20.416793 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:20 crc kubenswrapper[4834]: E1126 12:13:20.416858 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.496249 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.496271 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.496279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.496291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.496328 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:20Z","lastTransitionTime":"2025-11-26T12:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.598089 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.598123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.598134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.598146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.598153 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:20Z","lastTransitionTime":"2025-11-26T12:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.700124 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.700157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.700177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.700190 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.700199 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:20Z","lastTransitionTime":"2025-11-26T12:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.801740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.801771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.801779 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.801791 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.801799 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:20Z","lastTransitionTime":"2025-11-26T12:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.902990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.903035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.903085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.903100 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:20 crc kubenswrapper[4834]: I1126 12:13:20.903110 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:20Z","lastTransitionTime":"2025-11-26T12:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.004825 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.004856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.004864 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.004876 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.004884 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.106639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.106680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.106690 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.106709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.106723 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.208504 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.208528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.208536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.208547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.208555 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.311089 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.311126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.311136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.311149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.311159 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.413401 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.413436 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.413444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.413457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.413467 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.416669 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:21 crc kubenswrapper[4834]: E1126 12:13:21.416761 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.515012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.515069 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.515079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.515093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.515104 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.616712 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.616748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.616758 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.616771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.616781 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.718925 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.718957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.718965 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.718976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.718984 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.821190 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.821223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.821233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.821244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.821252 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.923052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.923088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.923096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.923110 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:21 crc kubenswrapper[4834]: I1126 12:13:21.923120 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:21Z","lastTransitionTime":"2025-11-26T12:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.025072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.025107 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.025115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.025128 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.025140 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.126816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.126854 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.126864 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.126880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.126889 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.187215 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:22 crc kubenswrapper[4834]: E1126 12:13:22.187359 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:13:22 crc kubenswrapper[4834]: E1126 12:13:22.187408 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs podName:6feada4f-ea0c-4062-ab87-ff88a4590c96 nodeName:}" failed. No retries permitted until 2025-11-26 12:14:26.187394935 +0000 UTC m=+164.094608278 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs") pod "network-metrics-daemon-tmlsw" (UID: "6feada4f-ea0c-4062-ab87-ff88a4590c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.228546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.228574 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.228583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.228594 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.228607 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.330062 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.330101 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.330111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.330126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.330136 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.416434 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.416436 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:22 crc kubenswrapper[4834]: E1126 12:13:22.416522 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.416551 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:22 crc kubenswrapper[4834]: E1126 12:13:22.416622 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:22 crc kubenswrapper[4834]: E1126 12:13:22.416727 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.427515 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5gvrf" podStartSLOduration=78.427506516 podStartE2EDuration="1m18.427506516s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.4274243 +0000 UTC m=+100.334637652" watchObservedRunningTime="2025-11-26 12:13:22.427506516 +0000 UTC m=+100.334719868" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.431392 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.431426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.431435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.431461 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.431477 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.475677 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=27.475658525 podStartE2EDuration="27.475658525s" podCreationTimestamp="2025-11-26 12:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.448641164 +0000 UTC m=+100.355854516" watchObservedRunningTime="2025-11-26 12:13:22.475658525 +0000 UTC m=+100.382871877" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.509907 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.509892059 podStartE2EDuration="51.509892059s" podCreationTimestamp="2025-11-26 12:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.509605186 +0000 UTC m=+100.416818538" watchObservedRunningTime="2025-11-26 12:13:22.509892059 +0000 UTC m=+100.417105411" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.517572 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-br5pb" podStartSLOduration=78.517554166 podStartE2EDuration="1m18.517554166s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.517338428 +0000 UTC m=+100.424551781" watchObservedRunningTime="2025-11-26 12:13:22.517554166 +0000 UTC m=+100.424767518" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.537643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 
crc kubenswrapper[4834]: I1126 12:13:22.537685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.537696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.537732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.537758 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.547921 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.547888101 podStartE2EDuration="1m18.547888101s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.535550598 +0000 UTC m=+100.442763950" watchObservedRunningTime="2025-11-26 12:13:22.547888101 +0000 UTC m=+100.455101454" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.560104 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q5s5d" podStartSLOduration=78.560080541 podStartE2EDuration="1m18.560080541s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-26 12:13:22.559390957 +0000 UTC m=+100.466604309" watchObservedRunningTime="2025-11-26 12:13:22.560080541 +0000 UTC m=+100.467293893" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.583067 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4rwmt" podStartSLOduration=78.583051746 podStartE2EDuration="1m18.583051746s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.58266316 +0000 UTC m=+100.489876512" watchObservedRunningTime="2025-11-26 12:13:22.583051746 +0000 UTC m=+100.490265098" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.592834 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-k8hjt" podStartSLOduration=78.592807204 podStartE2EDuration="1m18.592807204s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.592668492 +0000 UTC m=+100.499881843" watchObservedRunningTime="2025-11-26 12:13:22.592807204 +0000 UTC m=+100.500020556" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.604468 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podStartSLOduration=78.604457548 podStartE2EDuration="1m18.604457548s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.60386633 +0000 UTC m=+100.511079682" watchObservedRunningTime="2025-11-26 12:13:22.604457548 +0000 UTC m=+100.511670900" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.617448 4834 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.617437678 podStartE2EDuration="1m22.617437678s" podCreationTimestamp="2025-11-26 12:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.616445811 +0000 UTC m=+100.523659163" watchObservedRunningTime="2025-11-26 12:13:22.617437678 +0000 UTC m=+100.524651029" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.639677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.639702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.639712 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.639723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.639730 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.639833 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.63981031 podStartE2EDuration="1m16.63981031s" podCreationTimestamp="2025-11-26 12:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:22.630419762 +0000 UTC m=+100.537633114" watchObservedRunningTime="2025-11-26 12:13:22.63981031 +0000 UTC m=+100.547023652" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.741944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.741975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.741983 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.741993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.742002 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.844714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.844739 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.844748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.844759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.844767 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.946900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.946945 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.946955 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.946972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:22 crc kubenswrapper[4834]: I1126 12:13:22.946983 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:22Z","lastTransitionTime":"2025-11-26T12:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.049520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.049564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.049574 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.049590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.049601 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.152735 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.152772 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.152783 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.152794 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.152803 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.255383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.255430 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.255440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.255456 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.255472 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.357848 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.357888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.357897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.357913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.357924 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.416415 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:23 crc kubenswrapper[4834]: E1126 12:13:23.416535 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.417017 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:13:23 crc kubenswrapper[4834]: E1126 12:13:23.417160 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9dvt4_openshift-ovn-kubernetes(e7f44620-97b4-4cdb-8252-d8a2971830fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.460241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.460279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.460291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.460327 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.460339 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.562226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.562263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.562274 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.562288 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.562299 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.664800 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.665013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.665089 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.665167 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.665230 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.767581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.767710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.767775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.767835 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.767898 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.868970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.869115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.869193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.869262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.869343 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.971470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.971610 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.971856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.971927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:23 crc kubenswrapper[4834]: I1126 12:13:23.971985 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:23Z","lastTransitionTime":"2025-11-26T12:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.074024 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.074070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.074081 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.074093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.074102 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.175838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.175870 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.175879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.175890 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.175899 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.278011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.278040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.278049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.278073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.278082 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.379755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.379794 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.379803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.379821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.379832 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.416861 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:24 crc kubenswrapper[4834]: E1126 12:13:24.416969 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.417130 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:24 crc kubenswrapper[4834]: E1126 12:13:24.417271 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.417305 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:24 crc kubenswrapper[4834]: E1126 12:13:24.417918 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.481340 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.481375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.481384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.481398 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.481407 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.583023 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.583166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.583225 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.583285 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.583358 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.684642 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.684677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.684687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.684701 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.684712 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.785965 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.785996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.786004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.786020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.786029 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.887652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.887688 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.887698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.887709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.887718 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.989595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.989651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.989659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.989687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:24 crc kubenswrapper[4834]: I1126 12:13:24.989701 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:24Z","lastTransitionTime":"2025-11-26T12:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.091409 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.091442 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.091453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.091465 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.091472 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:25Z","lastTransitionTime":"2025-11-26T12:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.193600 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.193637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.193646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.193662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.193672 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:25Z","lastTransitionTime":"2025-11-26T12:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.295439 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.295482 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.295491 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.295507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.295517 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:25Z","lastTransitionTime":"2025-11-26T12:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.397334 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.397374 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.397386 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.397401 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.397411 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:25Z","lastTransitionTime":"2025-11-26T12:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.398332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.398366 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.398376 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.398388 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.398397 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T12:13:25Z","lastTransitionTime":"2025-11-26T12:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.416962 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:25 crc kubenswrapper[4834]: E1126 12:13:25.417058 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.431619 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf"] Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.431931 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.433179 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.433455 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.434131 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.434424 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.615185 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.615231 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.615274 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.615305 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.615478 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.716251 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.716291 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.716350 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.716370 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.716387 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.716501 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.716573 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.717115 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.721138 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.729433 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/906e9982-d10d-4bd5-bf1a-8e946c72f6a0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gfzbf\" (UID: \"906e9982-d10d-4bd5-bf1a-8e946c72f6a0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 
12:13:25.745200 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.849760 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" event={"ID":"906e9982-d10d-4bd5-bf1a-8e946c72f6a0","Type":"ContainerStarted","Data":"7110b441c570819f58a269d65267399b3b2fd8121de0b06869c79313c996e9fc"} Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.849802 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" event={"ID":"906e9982-d10d-4bd5-bf1a-8e946c72f6a0","Type":"ContainerStarted","Data":"b0b7c514428a93401b4f198a8e0022104b20d381687087b0dfd3a17becfe55ad"} Nov 26 12:13:25 crc kubenswrapper[4834]: I1126 12:13:25.860079 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gfzbf" podStartSLOduration=81.860049633 podStartE2EDuration="1m21.860049633s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:25.859439199 +0000 UTC m=+103.766652551" watchObservedRunningTime="2025-11-26 12:13:25.860049633 +0000 UTC m=+103.767262986" Nov 26 12:13:26 crc kubenswrapper[4834]: I1126 12:13:26.416044 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:26 crc kubenswrapper[4834]: I1126 12:13:26.416150 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:26 crc kubenswrapper[4834]: I1126 12:13:26.416326 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:26 crc kubenswrapper[4834]: E1126 12:13:26.416537 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:26 crc kubenswrapper[4834]: E1126 12:13:26.416399 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:26 crc kubenswrapper[4834]: E1126 12:13:26.416761 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:27 crc kubenswrapper[4834]: I1126 12:13:27.416427 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:27 crc kubenswrapper[4834]: E1126 12:13:27.416550 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:28 crc kubenswrapper[4834]: I1126 12:13:28.418908 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:28 crc kubenswrapper[4834]: I1126 12:13:28.419073 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:28 crc kubenswrapper[4834]: I1126 12:13:28.419259 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:28 crc kubenswrapper[4834]: E1126 12:13:28.419920 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:28 crc kubenswrapper[4834]: E1126 12:13:28.419928 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:28 crc kubenswrapper[4834]: E1126 12:13:28.419961 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:29 crc kubenswrapper[4834]: I1126 12:13:29.416515 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:29 crc kubenswrapper[4834]: E1126 12:13:29.416610 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:30 crc kubenswrapper[4834]: I1126 12:13:30.416286 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:30 crc kubenswrapper[4834]: I1126 12:13:30.416326 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:30 crc kubenswrapper[4834]: I1126 12:13:30.416355 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:30 crc kubenswrapper[4834]: E1126 12:13:30.416416 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:30 crc kubenswrapper[4834]: E1126 12:13:30.416473 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:30 crc kubenswrapper[4834]: E1126 12:13:30.416609 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:31 crc kubenswrapper[4834]: I1126 12:13:31.416949 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:31 crc kubenswrapper[4834]: E1126 12:13:31.417063 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:32 crc kubenswrapper[4834]: I1126 12:13:32.417086 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:32 crc kubenswrapper[4834]: I1126 12:13:32.417154 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:32 crc kubenswrapper[4834]: E1126 12:13:32.418026 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:32 crc kubenswrapper[4834]: I1126 12:13:32.418047 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:32 crc kubenswrapper[4834]: E1126 12:13:32.418143 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:32 crc kubenswrapper[4834]: E1126 12:13:32.418375 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:33 crc kubenswrapper[4834]: I1126 12:13:33.416680 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:33 crc kubenswrapper[4834]: E1126 12:13:33.416792 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:34 crc kubenswrapper[4834]: I1126 12:13:34.416613 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:34 crc kubenswrapper[4834]: I1126 12:13:34.416679 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:34 crc kubenswrapper[4834]: I1126 12:13:34.416636 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:34 crc kubenswrapper[4834]: E1126 12:13:34.416779 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:34 crc kubenswrapper[4834]: E1126 12:13:34.416852 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:34 crc kubenswrapper[4834]: E1126 12:13:34.417024 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:35 crc kubenswrapper[4834]: I1126 12:13:35.416199 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:35 crc kubenswrapper[4834]: E1126 12:13:35.416448 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:36 crc kubenswrapper[4834]: I1126 12:13:36.416597 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:36 crc kubenswrapper[4834]: I1126 12:13:36.416689 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:36 crc kubenswrapper[4834]: I1126 12:13:36.416984 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:36 crc kubenswrapper[4834]: E1126 12:13:36.416970 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:36 crc kubenswrapper[4834]: E1126 12:13:36.417153 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:36 crc kubenswrapper[4834]: E1126 12:13:36.417635 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:36 crc kubenswrapper[4834]: I1126 12:13:36.417956 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:13:36 crc kubenswrapper[4834]: I1126 12:13:36.874384 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/3.log" Nov 26 12:13:36 crc kubenswrapper[4834]: I1126 12:13:36.876854 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerStarted","Data":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} Nov 26 12:13:36 crc kubenswrapper[4834]: I1126 12:13:36.877212 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:13:36 crc kubenswrapper[4834]: I1126 12:13:36.896516 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podStartSLOduration=92.896500417 podStartE2EDuration="1m32.896500417s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:36.895690835 
+0000 UTC m=+114.802904187" watchObservedRunningTime="2025-11-26 12:13:36.896500417 +0000 UTC m=+114.803713769" Nov 26 12:13:37 crc kubenswrapper[4834]: I1126 12:13:37.093092 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tmlsw"] Nov 26 12:13:37 crc kubenswrapper[4834]: I1126 12:13:37.093392 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:37 crc kubenswrapper[4834]: E1126 12:13:37.093577 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:37 crc kubenswrapper[4834]: I1126 12:13:37.416615 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:37 crc kubenswrapper[4834]: E1126 12:13:37.416756 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:38 crc kubenswrapper[4834]: I1126 12:13:38.416948 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:38 crc kubenswrapper[4834]: I1126 12:13:38.417050 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:38 crc kubenswrapper[4834]: E1126 12:13:38.417153 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 12:13:38 crc kubenswrapper[4834]: E1126 12:13:38.417294 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.416517 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:39 crc kubenswrapper[4834]: E1126 12:13:39.416908 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.416534 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:39 crc kubenswrapper[4834]: E1126 12:13:39.417061 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tmlsw" podUID="6feada4f-ea0c-4062-ab87-ff88a4590c96" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.508599 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.508684 4834 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.534829 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.535213 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.538384 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hvfh2"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.538656 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.538708 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.539014 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.539055 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.539161 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.539022 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vctqw"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.539330 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.539435 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jztgs"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.539563 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.540164 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.540204 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.540538 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.542251 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.542388 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.542503 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.542508 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.542625 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.555617 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.559132 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 26 12:13:39 crc 
kubenswrapper[4834]: I1126 12:13:39.559578 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.561513 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.561565 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.561530 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.561703 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.561873 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.561965 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.562073 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.562393 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.561923 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.562793 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.562808 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.563577 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.563733 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.563882 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.564147 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.564275 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.564293 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.564387 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.564520 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.564549 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rt9fj"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.565034 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.565269 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w4l9d"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.565407 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.565621 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.565754 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.567272 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.567923 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.568152 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.571717 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-625th"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.572181 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.572386 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.572710 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573207 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573454 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573484 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573542 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573585 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573614 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573485 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573604 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573731 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573743 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573753 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573793 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573821 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573834 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573850 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.573860 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.574140 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8j9fr"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.574332 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.574438 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.574472 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.574573 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.574599 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.574765 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.574931 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.574974 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.575178 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9cvdp"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.575340 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.575401 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9rfmg"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.575490 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.575605 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-z2dpf"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.575744 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.575773 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.575843 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.575867 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.576080 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.576127 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.576078 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.578194 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9rfmg" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.579007 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.579250 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.579959 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.580229 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tdqww"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.584834 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.592997 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.593085 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.593129 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.593799 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.598273 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.608645 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.608772 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.609029 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.613070 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.614219 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.613674 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.614421 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.614342 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.614710 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tlgd5"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.613411 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.614915 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.615001 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.615117 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617469 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617552 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617627 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617655 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617631 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617753 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617866 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617885 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 
12:13:39.617916 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617995 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.617997 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.618079 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.618050 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.618098 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.618175 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.618302 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.618418 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.618588 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stt44"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.618657 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.618880 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.619294 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.619663 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcv95"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.619723 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.619803 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.619965 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.620140 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.620260 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.620613 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.621206 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.621359 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.621481 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.621588 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.621674 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.621692 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.621750 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.621834 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.622070 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 26 12:13:39 crc 
kubenswrapper[4834]: I1126 12:13:39.622267 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.622425 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.623733 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.624117 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.624188 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.624478 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.624788 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.624993 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.625022 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r462f"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.625151 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.625403 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.625699 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.626057 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.626341 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.626887 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.627530 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.627579 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.627992 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.628175 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.628762 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.638352 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.638808 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.640825 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bvjpq"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.641437 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcfca33b-1c7d-46ff-8f91-374eddf7b5b3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9w45j\" (UID: \"bcfca33b-1c7d-46ff-8f91-374eddf7b5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.641474 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5333eaad-5273-4b67-9876-a471945dbb76-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k6bw9\" (UID: \"5333eaad-5273-4b67-9876-a471945dbb76\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.641509 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-encryption-config\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.641691 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7893c5da-bc30-4072-a95b-01adc121b50b-config\") pod \"service-ca-operator-777779d784-xhdnp\" (UID: \"7893c5da-bc30-4072-a95b-01adc121b50b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.641743 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g7v4\" (UniqueName: \"kubernetes.io/projected/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-kube-api-access-7g7v4\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.641880 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.641910 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bflc9\" (UniqueName: \"kubernetes.io/projected/243172e8-801d-435a-a091-c436cd119f1f-kube-api-access-bflc9\") pod \"openshift-config-operator-7777fb866f-c5m8m\" (UID: \"243172e8-801d-435a-a091-c436cd119f1f\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.641926 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/243172e8-801d-435a-a091-c436cd119f1f-serving-cert\") pod \"openshift-config-operator-7777fb866f-c5m8m\" (UID: \"243172e8-801d-435a-a091-c436cd119f1f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.642096 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5333eaad-5273-4b67-9876-a471945dbb76-config\") pod \"kube-controller-manager-operator-78b949d7b-k6bw9\" (UID: \"5333eaad-5273-4b67-9876-a471945dbb76\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.642191 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn822\" (UniqueName: \"kubernetes.io/projected/7893c5da-bc30-4072-a95b-01adc121b50b-kube-api-access-nn822\") pod \"service-ca-operator-777779d784-xhdnp\" (UID: \"7893c5da-bc30-4072-a95b-01adc121b50b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.642221 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/243172e8-801d-435a-a091-c436cd119f1f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-c5m8m\" (UID: \"243172e8-801d-435a-a091-c436cd119f1f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.642248 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00b79e1c-4a38-43c4-81b0-647cbc8f3acf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hpfzc\" (UID: \"00b79e1c-4a38-43c4-81b0-647cbc8f3acf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.642712 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-audit-dir\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.642852 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6d59\" (UniqueName: \"kubernetes.io/projected/bcfca33b-1c7d-46ff-8f91-374eddf7b5b3-kube-api-access-x6d59\") pod \"cluster-samples-operator-665b6dd947-9w45j\" (UID: \"bcfca33b-1c7d-46ff-8f91-374eddf7b5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.642873 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-audit-policies\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643004 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00b79e1c-4a38-43c4-81b0-647cbc8f3acf-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-hpfzc\" (UID: \"00b79e1c-4a38-43c4-81b0-647cbc8f3acf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643090 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643038 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643154 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-etcd-client\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643299 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7893c5da-bc30-4072-a95b-01adc121b50b-serving-cert\") pod \"service-ca-operator-777779d784-xhdnp\" (UID: \"7893c5da-bc30-4072-a95b-01adc121b50b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643342 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-625th\" (UID: 
\"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643514 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw7ll\" (UniqueName: \"kubernetes.io/projected/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-kube-api-access-gw7ll\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643535 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-apiservice-cert\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643597 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5333eaad-5273-4b67-9876-a471945dbb76-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k6bw9\" (UID: \"5333eaad-5273-4b67-9876-a471945dbb76\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643616 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-tmpfs\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643636 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-serving-cert\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643657 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-images\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643680 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643694 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-proxy-tls\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643792 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnrgv\" (UniqueName: \"kubernetes.io/projected/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-kube-api-access-mnrgv\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643815 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7rbms\" (UniqueName: \"kubernetes.io/projected/00b79e1c-4a38-43c4-81b0-647cbc8f3acf-kube-api-access-7rbms\") pod \"openshift-apiserver-operator-796bbdcf4f-hpfzc\" (UID: \"00b79e1c-4a38-43c4-81b0-647cbc8f3acf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643887 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-webhook-cert\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.643929 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.676386 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vctqw"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.678973 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.679232 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.680104 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hvfh2"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.681037 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rt9fj"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.681771 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-625th"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.682532 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pss4j"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.683685 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.683792 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.684199 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.684944 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.685721 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.686526 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tdqww"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.687365 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.688759 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.688925 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.689716 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.690671 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w4l9d"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.691989 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.695039 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8j9fr"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.695074 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.695086 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-z2dpf"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.696272 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jztgs"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.697146 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.698705 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.700019 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pss4j"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 
12:13:39.701056 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.701236 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.701771 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.702041 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tlgd5"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.702419 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.702982 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.703974 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stt44"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.704597 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.705431 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcv95"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.706465 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9cvdp"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.707748 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.707772 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-n7k2p"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.708458 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n7k2p" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.708611 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9rfmg"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.709400 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qf9rv"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.709867 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qf9rv" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.710214 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.711364 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.712718 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.713864 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bvjpq"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.714869 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qf9rv"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.717575 4834 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.736654 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744596 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcfca33b-1c7d-46ff-8f91-374eddf7b5b3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9w45j\" (UID: \"bcfca33b-1c7d-46ff-8f91-374eddf7b5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744628 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5333eaad-5273-4b67-9876-a471945dbb76-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k6bw9\" (UID: \"5333eaad-5273-4b67-9876-a471945dbb76\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744662 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-encryption-config\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744679 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7893c5da-bc30-4072-a95b-01adc121b50b-config\") pod \"service-ca-operator-777779d784-xhdnp\" (UID: \"7893c5da-bc30-4072-a95b-01adc121b50b\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744707 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g7v4\" (UniqueName: \"kubernetes.io/projected/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-kube-api-access-7g7v4\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744726 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744754 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bflc9\" (UniqueName: \"kubernetes.io/projected/243172e8-801d-435a-a091-c436cd119f1f-kube-api-access-bflc9\") pod \"openshift-config-operator-7777fb866f-c5m8m\" (UID: \"243172e8-801d-435a-a091-c436cd119f1f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744772 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/243172e8-801d-435a-a091-c436cd119f1f-serving-cert\") pod \"openshift-config-operator-7777fb866f-c5m8m\" (UID: \"243172e8-801d-435a-a091-c436cd119f1f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744813 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5333eaad-5273-4b67-9876-a471945dbb76-config\") pod \"kube-controller-manager-operator-78b949d7b-k6bw9\" (UID: \"5333eaad-5273-4b67-9876-a471945dbb76\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744831 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn822\" (UniqueName: \"kubernetes.io/projected/7893c5da-bc30-4072-a95b-01adc121b50b-kube-api-access-nn822\") pod \"service-ca-operator-777779d784-xhdnp\" (UID: \"7893c5da-bc30-4072-a95b-01adc121b50b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744862 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/243172e8-801d-435a-a091-c436cd119f1f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-c5m8m\" (UID: \"243172e8-801d-435a-a091-c436cd119f1f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744885 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00b79e1c-4a38-43c4-81b0-647cbc8f3acf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hpfzc\" (UID: \"00b79e1c-4a38-43c4-81b0-647cbc8f3acf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744922 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-audit-dir\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 
12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744938 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6d59\" (UniqueName: \"kubernetes.io/projected/bcfca33b-1c7d-46ff-8f91-374eddf7b5b3-kube-api-access-x6d59\") pod \"cluster-samples-operator-665b6dd947-9w45j\" (UID: \"bcfca33b-1c7d-46ff-8f91-374eddf7b5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744960 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-audit-policies\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744974 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00b79e1c-4a38-43c4-81b0-647cbc8f3acf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hpfzc\" (UID: \"00b79e1c-4a38-43c4-81b0-647cbc8f3acf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.744992 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745005 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-etcd-client\") pod \"apiserver-7bbb656c7d-625th\" 
(UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745020 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7893c5da-bc30-4072-a95b-01adc121b50b-serving-cert\") pod \"service-ca-operator-777779d784-xhdnp\" (UID: \"7893c5da-bc30-4072-a95b-01adc121b50b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745035 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745053 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw7ll\" (UniqueName: \"kubernetes.io/projected/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-kube-api-access-gw7ll\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745076 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-apiservice-cert\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745093 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5333eaad-5273-4b67-9876-a471945dbb76-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k6bw9\" (UID: \"5333eaad-5273-4b67-9876-a471945dbb76\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745109 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-tmpfs\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745124 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-serving-cert\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745138 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-images\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745160 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-proxy-tls\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745188 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnrgv\" (UniqueName: \"kubernetes.io/projected/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-kube-api-access-mnrgv\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745203 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rbms\" (UniqueName: \"kubernetes.io/projected/00b79e1c-4a38-43c4-81b0-647cbc8f3acf-kube-api-access-7rbms\") pod \"openshift-apiserver-operator-796bbdcf4f-hpfzc\" (UID: \"00b79e1c-4a38-43c4-81b0-647cbc8f3acf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.745219 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-webhook-cert\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.746886 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-images\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.746994 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.748196 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.748941 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.749338 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5333eaad-5273-4b67-9876-a471945dbb76-config\") pod \"kube-controller-manager-operator-78b949d7b-k6bw9\" (UID: \"5333eaad-5273-4b67-9876-a471945dbb76\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.750070 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-audit-dir\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.750097 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/243172e8-801d-435a-a091-c436cd119f1f-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-c5m8m\" (UID: \"243172e8-801d-435a-a091-c436cd119f1f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.750654 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00b79e1c-4a38-43c4-81b0-647cbc8f3acf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hpfzc\" (UID: \"00b79e1c-4a38-43c4-81b0-647cbc8f3acf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.752993 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-encryption-config\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.753282 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-tmpfs\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.753303 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcfca33b-1c7d-46ff-8f91-374eddf7b5b3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9w45j\" (UID: \"bcfca33b-1c7d-46ff-8f91-374eddf7b5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.754448 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-audit-policies\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.755216 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-proxy-tls\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.755570 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00b79e1c-4a38-43c4-81b0-647cbc8f3acf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hpfzc\" (UID: \"00b79e1c-4a38-43c4-81b0-647cbc8f3acf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.756440 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-etcd-client\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.756452 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5333eaad-5273-4b67-9876-a471945dbb76-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k6bw9\" (UID: \"5333eaad-5273-4b67-9876-a471945dbb76\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.758191 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-serving-cert\") pod \"apiserver-7bbb656c7d-625th\" (UID: \"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.758853 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.761542 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/243172e8-801d-435a-a091-c436cd119f1f-serving-cert\") pod \"openshift-config-operator-7777fb866f-c5m8m\" (UID: \"243172e8-801d-435a-a091-c436cd119f1f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.761627 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ztdb4"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.762347 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.766325 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ztdb4"] Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.776842 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.797757 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.817229 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.837407 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.856514 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.883227 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.896956 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.917041 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.937557 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 
26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.958495 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.977343 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 26 12:13:39 crc kubenswrapper[4834]: I1126 12:13:39.998134 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.016789 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.038028 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.057168 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.077286 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.097876 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.116791 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.136485 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.157135 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.163787 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-apiservice-cert\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.167944 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-webhook-cert\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.176892 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.197065 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.217199 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.237081 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.256580 4834 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.277605 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.297379 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.317151 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.336966 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.356834 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.377377 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.397155 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.416714 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.417015 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.423681 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.437155 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.457222 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.477306 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.496874 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.517547 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.536816 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.557269 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.577535 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.601070 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.616598 4834 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.636071 4834 request.go:700] Waited for 1.01559756s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.637529 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.657578 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.676737 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.697509 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.706083 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7893c5da-bc30-4072-a95b-01adc121b50b-config\") pod \"service-ca-operator-777779d784-xhdnp\" (UID: \"7893c5da-bc30-4072-a95b-01adc121b50b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.717168 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.736439 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.742820 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7893c5da-bc30-4072-a95b-01adc121b50b-serving-cert\") pod \"service-ca-operator-777779d784-xhdnp\" (UID: \"7893c5da-bc30-4072-a95b-01adc121b50b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.757035 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.776726 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.798547 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.816709 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.838218 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.857531 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.877532 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.896971 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 26 12:13:40 crc 
kubenswrapper[4834]: I1126 12:13:40.917664 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.937584 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.957107 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.977576 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 26 12:13:40 crc kubenswrapper[4834]: I1126 12:13:40.997325 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.018035 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.037130 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.057523 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.097452 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.117899 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.137127 4834 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.157340 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.177478 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.196635 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.237414 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.257100 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.277254 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.297329 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.317393 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.337605 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.356741 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.376870 4834 reflector.go:368] Caches 
populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.398388 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.416084 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.416094 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.417375 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.436935 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.457743 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.477089 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.497270 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.517596 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.549678 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g7v4\" (UniqueName: 
\"kubernetes.io/projected/d1d6e1fe-3809-4973-bac7-6bdd60d74d74-kube-api-access-7g7v4\") pod \"machine-config-operator-74547568cd-zl8n8\" (UID: \"d1d6e1fe-3809-4973-bac7-6bdd60d74d74\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.568739 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5333eaad-5273-4b67-9876-a471945dbb76-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k6bw9\" (UID: \"5333eaad-5273-4b67-9876-a471945dbb76\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.582618 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.608129 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bflc9\" (UniqueName: \"kubernetes.io/projected/243172e8-801d-435a-a091-c436cd119f1f-kube-api-access-bflc9\") pod \"openshift-config-operator-7777fb866f-c5m8m\" (UID: \"243172e8-801d-435a-a091-c436cd119f1f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.619743 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnrgv\" (UniqueName: \"kubernetes.io/projected/ea236e0f-7859-40a9-b92d-a0a54ce9d6f1-kube-api-access-mnrgv\") pod \"packageserver-d55dfcdfc-l74fz\" (UID: \"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.629789 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rbms\" (UniqueName: 
\"kubernetes.io/projected/00b79e1c-4a38-43c4-81b0-647cbc8f3acf-kube-api-access-7rbms\") pod \"openshift-apiserver-operator-796bbdcf4f-hpfzc\" (UID: \"00b79e1c-4a38-43c4-81b0-647cbc8f3acf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.648761 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn822\" (UniqueName: \"kubernetes.io/projected/7893c5da-bc30-4072-a95b-01adc121b50b-kube-api-access-nn822\") pod \"service-ca-operator-777779d784-xhdnp\" (UID: \"7893c5da-bc30-4072-a95b-01adc121b50b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.652482 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.656430 4834 request.go:700] Waited for 1.906755778s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.671073 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6d59\" (UniqueName: \"kubernetes.io/projected/bcfca33b-1c7d-46ff-8f91-374eddf7b5b3-kube-api-access-x6d59\") pod \"cluster-samples-operator-665b6dd947-9w45j\" (UID: \"bcfca33b-1c7d-46ff-8f91-374eddf7b5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.691732 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw7ll\" (UniqueName: \"kubernetes.io/projected/c4585ceb-bed5-4ab6-917e-15778b7a0f4d-kube-api-access-gw7ll\") pod \"apiserver-7bbb656c7d-625th\" (UID: 
\"c4585ceb-bed5-4ab6-917e-15778b7a0f4d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.698224 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.718692 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.731403 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8"] Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.737504 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.741565 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" Nov 26 12:13:41 crc kubenswrapper[4834]: W1126 12:13:41.742506 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d6e1fe_3809_4973_bac7_6bdd60d74d74.slice/crio-d39e98b6014063fa19cc5a489c6239f4a500a6f4e641c7cd11423f0c5f9bfba2 WatchSource:0}: Error finding container d39e98b6014063fa19cc5a489c6239f4a500a6f4e641c7cd11423f0c5f9bfba2: Status 404 returned error can't find the container with id d39e98b6014063fa19cc5a489c6239f4a500a6f4e641c7cd11423f0c5f9bfba2 Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.757211 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.777264 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 
12:13:41.785613 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp"] Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.797792 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.804515 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.813016 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.817125 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.825695 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.836637 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.866978 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw47q\" (UniqueName: \"kubernetes.io/projected/4e332248-0575-4104-a6af-888110da84c0-kube-api-access-hw47q\") pod \"migrator-59844c95c7-bq2xl\" (UID: \"4e332248-0575-4104-a6af-888110da84c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867016 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-config\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867035 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tm6g\" (UniqueName: \"kubernetes.io/projected/17225c8b-a6dd-4958-b1be-58b3cc4ad317-kube-api-access-7tm6g\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867053 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867070 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-image-import-ca\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867087 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262058ab-c269-4132-84e9-a244944bcc0d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtjlx\" (UID: \"262058ab-c269-4132-84e9-a244944bcc0d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867106 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6mz2\" (UniqueName: \"kubernetes.io/projected/0d700278-027d-44e8-b8e3-c262fc0a6b44-kube-api-access-p6mz2\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867124 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867143 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867159 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-trusted-ca\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867174 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9scc\" (UniqueName: \"kubernetes.io/projected/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-kube-api-access-b9scc\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867201 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/280cee19-adbf-4307-ac10-337b76f6b6d1-config-volume\") pod \"collect-profiles-29402640-25hcz\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867217 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rds48\" (UniqueName: \"kubernetes.io/projected/6c485dff-89c6-4d40-8ba4-3b69ac68820e-kube-api-access-rds48\") pod \"marketplace-operator-79b997595-pcv95\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867232 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867249 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867268 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r97rn\" (UniqueName: \"kubernetes.io/projected/262058ab-c269-4132-84e9-a244944bcc0d-kube-api-access-r97rn\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtjlx\" (UID: \"262058ab-c269-4132-84e9-a244944bcc0d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867286 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-config\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867303 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-service-ca-bundle\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867336 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2fa5531-03be-4051-968e-a3b00820266e-audit-dir\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867350 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f66f2\" (UniqueName: \"kubernetes.io/projected/c2fa5531-03be-4051-968e-a3b00820266e-kube-api-access-f66f2\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867363 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/262058ab-c269-4132-84e9-a244944bcc0d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtjlx\" (UID: \"262058ab-c269-4132-84e9-a244944bcc0d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867377 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867394 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-serving-cert\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867410 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867423 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/280cee19-adbf-4307-ac10-337b76f6b6d1-secret-volume\") pod \"collect-profiles-29402640-25hcz\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867437 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qxv\" (UniqueName: \"kubernetes.io/projected/a5dda160-6a44-4f03-b9a4-9baeeae03a54-kube-api-access-56qxv\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867455 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a53f903b-5b01-467d-be5c-28aabf3dc48b-machine-approver-tls\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867481 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns569\" (UniqueName: \"kubernetes.io/projected/b34fa680-13b0-492a-a80d-c2ef0709efb4-kube-api-access-ns569\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867506 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-metrics-tls\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867521 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-tls\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867534 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/60f10e1e-0000-46ee-9c26-d79f680a79df-encryption-config\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867551 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb2n2\" (UniqueName: \"kubernetes.io/projected/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-kube-api-access-mb2n2\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867568 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867583 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867598 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-oauth-serving-cert\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" 
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867613 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-metrics-certs\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867630 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1ea3730-6a71-4f4c-8588-7e39021bdb3d-proxy-tls\") pod \"machine-config-controller-84d6567774-dhx2d\" (UID: \"a1ea3730-6a71-4f4c-8588-7e39021bdb3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867644 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pcv95\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867660 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c425edda-322b-4b12-bf02-48ceeeed9a59-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867678 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-serving-cert\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867691 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-default-certificate\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867710 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5dda160-6a44-4f03-b9a4-9baeeae03a54-serving-cert\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867727 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d700278-027d-44e8-b8e3-c262fc0a6b44-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867744 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b34fa680-13b0-492a-a80d-c2ef0709efb4-serving-cert\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 
12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867758 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-etcd-serving-ca\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867775 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-trusted-ca\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867791 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/059853c5-abf6-459b-9719-cc6688fd6e0a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tlgd5\" (UID: \"059853c5-abf6-459b-9719-cc6688fd6e0a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867804 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a53f903b-5b01-467d-be5c-28aabf3dc48b-auth-proxy-config\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867820 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b34fa680-13b0-492a-a80d-c2ef0709efb4-trusted-ca\") pod 
\"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867835 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867862 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c425edda-322b-4b12-bf02-48ceeeed9a59-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867876 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867891 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nztgq\" (UniqueName: \"kubernetes.io/projected/0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7-kube-api-access-nztgq\") pod \"kube-storage-version-migrator-operator-b67b599dd-j2jb2\" (UID: \"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867907 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f95p4\" (UniqueName: \"kubernetes.io/projected/2d5dd7f9-7abf-4c5e-ad19-96e010400267-kube-api-access-f95p4\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dhk4\" (UID: \"2d5dd7f9-7abf-4c5e-ad19-96e010400267\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867921 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867935 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pcv95\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867952 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5cfef378-bc85-4056-bccd-fbfec5535325-metrics-tls\") pod \"dns-operator-744455d44c-8j9fr\" (UID: \"5cfef378-bc85-4056-bccd-fbfec5535325\") " pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867968 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-stats-auth\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867982 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0d700278-027d-44e8-b8e3-c262fc0a6b44-images\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.867997 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-client-ca\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868012 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868026 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-service-ca\") pod \"console-f9d7485db-z2dpf\" (UID: 
\"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868040 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-bound-sa-token\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868057 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-j2jb2\" (UID: \"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868079 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60f10e1e-0000-46ee-9c26-d79f680a79df-audit-dir\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868093 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868117 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkr44\" (UniqueName: \"kubernetes.io/projected/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-kube-api-access-dkr44\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868133 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qls4f\" (UniqueName: \"kubernetes.io/projected/34801d76-ed04-44db-a572-a7a43067159d-kube-api-access-qls4f\") pod \"downloads-7954f5f757-9rfmg\" (UID: \"34801d76-ed04-44db-a572-a7a43067159d\") " pod="openshift-console/downloads-7954f5f757-9rfmg" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868151 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f1fa120-0514-437c-9a02-2b3a0b9c9050-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-28tv5\" (UID: \"8f1fa120-0514-437c-9a02-2b3a0b9c9050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868166 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-audit\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868182 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6470256-c870-4267-8e33-542f83d4f07c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mg6qr\" (UID: 
\"e6470256-c870-4267-8e33-542f83d4f07c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868215 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6t7\" (UniqueName: \"kubernetes.io/projected/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-kube-api-access-kk6t7\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868233 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1fa120-0514-437c-9a02-2b3a0b9c9050-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-28tv5\" (UID: \"8f1fa120-0514-437c-9a02-2b3a0b9c9050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868247 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-config\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868261 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-etcd-ca\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868276 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6470256-c870-4267-8e33-542f83d4f07c-config\") pod \"kube-apiserver-operator-766d6c64bb-mg6qr\" (UID: \"e6470256-c870-4267-8e33-542f83d4f07c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868290 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-trusted-ca-bundle\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868319 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/60f10e1e-0000-46ee-9c26-d79f680a79df-etcd-client\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868335 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48hv5\" (UniqueName: \"kubernetes.io/projected/c425edda-322b-4b12-bf02-48ceeeed9a59-kube-api-access-48hv5\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868352 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d700278-027d-44e8-b8e3-c262fc0a6b44-config\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868368 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8njb\" (UniqueName: \"kubernetes.io/projected/60f10e1e-0000-46ee-9c26-d79f680a79df-kube-api-access-c8njb\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868383 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vcj6\" (UniqueName: \"kubernetes.io/projected/280cee19-adbf-4307-ac10-337b76f6b6d1-kube-api-access-4vcj6\") pod \"collect-profiles-29402640-25hcz\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868398 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f1fa120-0514-437c-9a02-2b3a0b9c9050-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-28tv5\" (UID: \"8f1fa120-0514-437c-9a02-2b3a0b9c9050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868413 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34fa680-13b0-492a-a80d-c2ef0709efb4-config\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868427 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-etcd-service-ca\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868445 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8rhm\" (UniqueName: \"kubernetes.io/projected/a53f903b-5b01-467d-be5c-28aabf3dc48b-kube-api-access-g8rhm\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868461 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-config\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868479 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d5dd7f9-7abf-4c5e-ad19-96e010400267-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dhk4\" (UID: \"2d5dd7f9-7abf-4c5e-ad19-96e010400267\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868494 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-serving-cert\") pod \"etcd-operator-b45778765-stt44\" (UID: 
\"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868509 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868524 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6470256-c870-4267-8e33-542f83d4f07c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mg6qr\" (UID: \"e6470256-c870-4267-8e33-542f83d4f07c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868539 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a53f903b-5b01-467d-be5c-28aabf3dc48b-config\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868554 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsm7g\" (UniqueName: \"kubernetes.io/projected/5cfef378-bc85-4056-bccd-fbfec5535325-kube-api-access-xsm7g\") pod \"dns-operator-744455d44c-8j9fr\" (UID: \"5cfef378-bc85-4056-bccd-fbfec5535325\") " pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868570 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-audit-policies\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868586 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-service-ca-bundle\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868603 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f10e1e-0000-46ee-9c26-d79f680a79df-serving-cert\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868621 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-certificates\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868636 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-etcd-client\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868652 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jxsz\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-kube-api-access-7jxsz\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868667 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-config\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868682 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-config\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868696 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-oauth-config\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868713 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gs6v\" (UniqueName: 
\"kubernetes.io/projected/a1ea3730-6a71-4f4c-8588-7e39021bdb3d-kube-api-access-7gs6v\") pod \"machine-config-controller-84d6567774-dhx2d\" (UID: \"a1ea3730-6a71-4f4c-8588-7e39021bdb3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868727 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/60f10e1e-0000-46ee-9c26-d79f680a79df-node-pullsecrets\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868750 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868765 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868781 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nx9t\" (UniqueName: \"kubernetes.io/projected/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-kube-api-access-4nx9t\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " 
pod="openshift-ingress/router-default-5444994796-r462f"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868796 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c425edda-322b-4b12-bf02-48ceeeed9a59-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868812 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-j2jb2\" (UID: \"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868828 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a1ea3730-6a71-4f4c-8588-7e39021bdb3d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dhx2d\" (UID: \"a1ea3730-6a71-4f4c-8588-7e39021bdb3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868843 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-client-ca\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868859 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-serving-cert\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868873 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfcjf\" (UniqueName: \"kubernetes.io/projected/059853c5-abf6-459b-9719-cc6688fd6e0a-kube-api-access-hfcjf\") pod \"multus-admission-controller-857f4d67dd-tlgd5\" (UID: \"059853c5-abf6-459b-9719-cc6688fd6e0a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.868890 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj"
Nov 26 12:13:41 crc kubenswrapper[4834]: E1126 12:13:41.870262 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:42.370247045 +0000 UTC m=+120.277460398 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.873127 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc"]
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.877463 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Nov 26 12:13:41 crc kubenswrapper[4834]: W1126 12:13:41.883425 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b79e1c_4a38_43c4_81b0_647cbc8f3acf.slice/crio-8e38eb6bfaed2122010b2c853d68d72f83c0e8d855593122bfa247f332357377 WatchSource:0}: Error finding container 8e38eb6bfaed2122010b2c853d68d72f83c0e8d855593122bfa247f332357377: Status 404 returned error can't find the container with id 8e38eb6bfaed2122010b2c853d68d72f83c0e8d855593122bfa247f332357377
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.893844 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" event={"ID":"00b79e1c-4a38-43c4-81b0-647cbc8f3acf","Type":"ContainerStarted","Data":"8e38eb6bfaed2122010b2c853d68d72f83c0e8d855593122bfa247f332357377"}
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.895971 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" event={"ID":"d1d6e1fe-3809-4973-bac7-6bdd60d74d74","Type":"ContainerStarted","Data":"b2640cea6c8470ee97b67693f710ae41486e6711c5447bea1aac98877b68216c"}
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.896018 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" event={"ID":"d1d6e1fe-3809-4973-bac7-6bdd60d74d74","Type":"ContainerStarted","Data":"d39e98b6014063fa19cc5a489c6239f4a500a6f4e641c7cd11423f0c5f9bfba2"}
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.897324 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.898443 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" event={"ID":"7893c5da-bc30-4072-a95b-01adc121b50b","Type":"ContainerStarted","Data":"c876cd33cf321e780f01ed147969d85e91090d0218d444687b79d1d7306a1c42"}
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.912246 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.971703 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 12:13:41 crc kubenswrapper[4834]: E1126 12:13:41.971853 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:42.471815023 +0000 UTC m=+120.379028376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972158 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a25744d0-f7b9-4349-a07c-9655dd2a84c2-config-volume\") pod \"dns-default-ztdb4\" (UID: \"a25744d0-f7b9-4349-a07c-9655dd2a84c2\") " pod="openshift-dns/dns-default-ztdb4"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972411 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkr44\" (UniqueName: \"kubernetes.io/projected/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-kube-api-access-dkr44\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972471 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qls4f\" (UniqueName: \"kubernetes.io/projected/34801d76-ed04-44db-a572-a7a43067159d-kube-api-access-qls4f\") pod \"downloads-7954f5f757-9rfmg\" (UID: \"34801d76-ed04-44db-a572-a7a43067159d\") " pod="openshift-console/downloads-7954f5f757-9rfmg"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972606 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f1fa120-0514-437c-9a02-2b3a0b9c9050-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-28tv5\" (UID: \"8f1fa120-0514-437c-9a02-2b3a0b9c9050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972648 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-mountpoint-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972667 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-audit\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972687 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6470256-c870-4267-8e33-542f83d4f07c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mg6qr\" (UID: \"e6470256-c870-4267-8e33-542f83d4f07c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972705 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzlm9\" (UniqueName: \"kubernetes.io/projected/f8ffe635-f56c-4c0b-915a-1c5aab084305-kube-api-access-hzlm9\") pod \"ingress-canary-qf9rv\" (UID: \"f8ffe635-f56c-4c0b-915a-1c5aab084305\") " pod="openshift-ingress-canary/ingress-canary-qf9rv"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972732 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6t7\" (UniqueName: \"kubernetes.io/projected/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-kube-api-access-kk6t7\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972768 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-config\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972782 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-etcd-ca\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972808 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6470256-c870-4267-8e33-542f83d4f07c-config\") pod \"kube-apiserver-operator-766d6c64bb-mg6qr\" (UID: \"e6470256-c870-4267-8e33-542f83d4f07c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972824 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-trusted-ca-bundle\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972840 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1fa120-0514-437c-9a02-2b3a0b9c9050-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-28tv5\" (UID: \"8f1fa120-0514-437c-9a02-2b3a0b9c9050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972854 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48hv5\" (UniqueName: \"kubernetes.io/projected/c425edda-322b-4b12-bf02-48ceeeed9a59-kube-api-access-48hv5\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972871 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d700278-027d-44e8-b8e3-c262fc0a6b44-config\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972896 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/60f10e1e-0000-46ee-9c26-d79f680a79df-etcd-client\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972913 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8njb\" (UniqueName: \"kubernetes.io/projected/60f10e1e-0000-46ee-9c26-d79f680a79df-kube-api-access-c8njb\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972928 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vcj6\" (UniqueName: \"kubernetes.io/projected/280cee19-adbf-4307-ac10-337b76f6b6d1-kube-api-access-4vcj6\") pod \"collect-profiles-29402640-25hcz\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972942 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34fa680-13b0-492a-a80d-c2ef0709efb4-config\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972965 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-etcd-service-ca\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.972981 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9632af41-e005-4c92-b162-8a7fb760ac72-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w5jw8\" (UID: \"9632af41-e005-4c92-b162-8a7fb760ac72\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973004 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f1fa120-0514-437c-9a02-2b3a0b9c9050-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-28tv5\" (UID: \"8f1fa120-0514-437c-9a02-2b3a0b9c9050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973017 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8rhm\" (UniqueName: \"kubernetes.io/projected/a53f903b-5b01-467d-be5c-28aabf3dc48b-kube-api-access-g8rhm\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973031 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-config\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973047 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9a6cf357-27ce-4d2e-910a-b7223815b902-profile-collector-cert\") pod \"catalog-operator-68c6474976-zbj9r\" (UID: \"9a6cf357-27ce-4d2e-910a-b7223815b902\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973067 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973082 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6470256-c870-4267-8e33-542f83d4f07c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mg6qr\" (UID: \"e6470256-c870-4267-8e33-542f83d4f07c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973096 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc5g6\" (UniqueName: \"kubernetes.io/projected/166b9d5c-74b2-4891-bbba-f4c2e81b683d-kube-api-access-vc5g6\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973115 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d5dd7f9-7abf-4c5e-ad19-96e010400267-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dhk4\" (UID: \"2d5dd7f9-7abf-4c5e-ad19-96e010400267\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973130 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-serving-cert\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973147 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a53f903b-5b01-467d-be5c-28aabf3dc48b-config\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973163 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsm7g\" (UniqueName: \"kubernetes.io/projected/5cfef378-bc85-4056-bccd-fbfec5535325-kube-api-access-xsm7g\") pod \"dns-operator-744455d44c-8j9fr\" (UID: \"5cfef378-bc85-4056-bccd-fbfec5535325\") " pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973178 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-audit-policies\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973204 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-service-ca-bundle\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973217 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f10e1e-0000-46ee-9c26-d79f680a79df-serving-cert\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973248 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-etcd-client\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973265 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-certificates\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973281 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jxsz\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-kube-api-access-7jxsz\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973295 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-config\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973324 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-config\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973342 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-oauth-config\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973368 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbgrg\" (UniqueName: \"kubernetes.io/projected/fb88bb19-bdc2-44e9-9c43-3dc00e419563-kube-api-access-vbgrg\") pod \"machine-config-server-n7k2p\" (UID: \"fb88bb19-bdc2-44e9-9c43-3dc00e419563\") " pod="openshift-machine-config-operator/machine-config-server-n7k2p"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973385 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gs6v\" (UniqueName: \"kubernetes.io/projected/a1ea3730-6a71-4f4c-8588-7e39021bdb3d-kube-api-access-7gs6v\") pod \"machine-config-controller-84d6567774-dhx2d\" (UID: \"a1ea3730-6a71-4f4c-8588-7e39021bdb3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973398 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/60f10e1e-0000-46ee-9c26-d79f680a79df-node-pullsecrets\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973419 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973434 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973448 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nx9t\" (UniqueName: \"kubernetes.io/projected/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-kube-api-access-4nx9t\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973463 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c425edda-322b-4b12-bf02-48ceeeed9a59-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973477 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdbbc\" (UniqueName: \"kubernetes.io/projected/a25744d0-f7b9-4349-a07c-9655dd2a84c2-kube-api-access-mdbbc\") pod \"dns-default-ztdb4\" (UID: \"a25744d0-f7b9-4349-a07c-9655dd2a84c2\") " pod="openshift-dns/dns-default-ztdb4"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973494 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-j2jb2\" (UID: \"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973508 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a1ea3730-6a71-4f4c-8588-7e39021bdb3d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dhx2d\" (UID: \"a1ea3730-6a71-4f4c-8588-7e39021bdb3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973524 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-client-ca\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973537 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-socket-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973550 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n9xx\" (UniqueName: \"kubernetes.io/projected/9632af41-e005-4c92-b162-8a7fb760ac72-kube-api-access-8n9xx\") pod \"olm-operator-6b444d44fb-w5jw8\" (UID: \"9632af41-e005-4c92-b162-8a7fb760ac72\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973569 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-serving-cert\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973583 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfcjf\" (UniqueName: \"kubernetes.io/projected/059853c5-abf6-459b-9719-cc6688fd6e0a-kube-api-access-hfcjf\") pod \"multus-admission-controller-857f4d67dd-tlgd5\" (UID: \"059853c5-abf6-459b-9719-cc6688fd6e0a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973598 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973615 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fb88bb19-bdc2-44e9-9c43-3dc00e419563-node-bootstrap-token\") pod \"machine-config-server-n7k2p\" (UID: \"fb88bb19-bdc2-44e9-9c43-3dc00e419563\") " pod="openshift-machine-config-operator/machine-config-server-n7k2p"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973631 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw47q\" (UniqueName: \"kubernetes.io/projected/4e332248-0575-4104-a6af-888110da84c0-kube-api-access-hw47q\") pod \"migrator-59844c95c7-bq2xl\" (UID: \"4e332248-0575-4104-a6af-888110da84c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973649 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-plugins-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973692 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-config\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973701 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-config\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973706 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tm6g\" (UniqueName: \"kubernetes.io/projected/17225c8b-a6dd-4958-b1be-58b3cc4ad317-kube-api-access-7tm6g\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973767 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973787 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262058ab-c269-4132-84e9-a244944bcc0d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtjlx\" (UID: \"262058ab-c269-4132-84e9-a244944bcc0d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973824 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fb88bb19-bdc2-44e9-9c43-3dc00e419563-certs\") pod \"machine-config-server-n7k2p\" (UID: \"fb88bb19-bdc2-44e9-9c43-3dc00e419563\") " pod="openshift-machine-config-operator/machine-config-server-n7k2p"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973854 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-image-import-ca\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973877 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6mz2\" (UniqueName: \"kubernetes.io/projected/0d700278-027d-44e8-b8e3-c262fc0a6b44-kube-api-access-p6mz2\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973924 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973940 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973957 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/280cee19-adbf-4307-ac10-337b76f6b6d1-config-volume\") pod \"collect-profiles-29402640-25hcz\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.973996 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-trusted-ca\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974012 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9scc\" (UniqueName: \"kubernetes.io/projected/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-kube-api-access-b9scc\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974029 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974044 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974080 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r97rn\" (UniqueName: \"kubernetes.io/projected/262058ab-c269-4132-84e9-a244944bcc0d-kube-api-access-r97rn\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtjlx\" (UID: \"262058ab-c269-4132-84e9-a244944bcc0d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974095 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8ffe635-f56c-4c0b-915a-1c5aab084305-cert\") pod \"ingress-canary-qf9rv\" (UID: \"f8ffe635-f56c-4c0b-915a-1c5aab084305\") " pod="openshift-ingress-canary/ingress-canary-qf9rv"
Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974119 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rds48\" (UniqueName: \"kubernetes.io/projected/6c485dff-89c6-4d40-8ba4-3b69ac68820e-kube-api-access-rds48\") pod \"marketplace-operator-79b997595-pcv95\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974154 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-config\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974170 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn94q\" (UniqueName: \"kubernetes.io/projected/9a6cf357-27ce-4d2e-910a-b7223815b902-kube-api-access-bn94q\") pod \"catalog-operator-68c6474976-zbj9r\" (UID: \"9a6cf357-27ce-4d2e-910a-b7223815b902\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974194 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/262058ab-c269-4132-84e9-a244944bcc0d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtjlx\" (UID: \"262058ab-c269-4132-84e9-a244944bcc0d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974230 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-service-ca-bundle\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974246 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2fa5531-03be-4051-968e-a3b00820266e-audit-dir\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974258 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-etcd-ca\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974261 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f66f2\" (UniqueName: \"kubernetes.io/projected/c2fa5531-03be-4051-968e-a3b00820266e-kube-api-access-f66f2\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974324 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974348 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-serving-cert\") pod 
\"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974368 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/280cee19-adbf-4307-ac10-337b76f6b6d1-secret-volume\") pod \"collect-profiles-29402640-25hcz\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974387 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56qxv\" (UniqueName: \"kubernetes.io/projected/a5dda160-6a44-4f03-b9a4-9baeeae03a54-kube-api-access-56qxv\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974403 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a25744d0-f7b9-4349-a07c-9655dd2a84c2-metrics-tls\") pod \"dns-default-ztdb4\" (UID: \"a25744d0-f7b9-4349-a07c-9655dd2a84c2\") " pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974424 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974440 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" 
(UniqueName: \"kubernetes.io/secret/a53f903b-5b01-467d-be5c-28aabf3dc48b-machine-approver-tls\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974465 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns569\" (UniqueName: \"kubernetes.io/projected/b34fa680-13b0-492a-a80d-c2ef0709efb4-kube-api-access-ns569\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974495 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mrxf\" (UniqueName: \"kubernetes.io/projected/12016219-99d2-4285-8112-a92858f97d1f-kube-api-access-7mrxf\") pod \"package-server-manager-789f6589d5-nt8mw\" (UID: \"12016219-99d2-4285-8112-a92858f97d1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974532 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-metrics-tls\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974557 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-tls\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc 
kubenswrapper[4834]: I1126 12:13:41.974573 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/60f10e1e-0000-46ee-9c26-d79f680a79df-encryption-config\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974587 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb2n2\" (UniqueName: \"kubernetes.io/projected/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-kube-api-access-mb2n2\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974602 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974615 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-registration-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974632 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-oauth-serving-cert\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " 
pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974647 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8501154d-b03b-41f2-9b7d-c2d834718044-signing-cabundle\") pod \"service-ca-9c57cc56f-bvjpq\" (UID: \"8501154d-b03b-41f2-9b7d-c2d834718044\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974660 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9632af41-e005-4c92-b162-8a7fb760ac72-srv-cert\") pod \"olm-operator-6b444d44fb-w5jw8\" (UID: \"9632af41-e005-4c92-b162-8a7fb760ac72\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974700 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974718 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-metrics-certs\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974765 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1ea3730-6a71-4f4c-8588-7e39021bdb3d-proxy-tls\") pod 
\"machine-config-controller-84d6567774-dhx2d\" (UID: \"a1ea3730-6a71-4f4c-8588-7e39021bdb3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974780 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pcv95\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974807 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c425edda-322b-4b12-bf02-48ceeeed9a59-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974822 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-serving-cert\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974836 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-default-certificate\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974873 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5dda160-6a44-4f03-b9a4-9baeeae03a54-serving-cert\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974892 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d700278-027d-44e8-b8e3-c262fc0a6b44-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974906 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-etcd-serving-ca\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974957 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b34fa680-13b0-492a-a80d-c2ef0709efb4-serving-cert\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.974974 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-trusted-ca\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 
crc kubenswrapper[4834]: I1126 12:13:41.974988 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/059853c5-abf6-459b-9719-cc6688fd6e0a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tlgd5\" (UID: \"059853c5-abf6-459b-9719-cc6688fd6e0a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.975003 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.975019 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a53f903b-5b01-467d-be5c-28aabf3dc48b-auth-proxy-config\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.975035 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b34fa680-13b0-492a-a80d-c2ef0709efb4-trusted-ca\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.975050 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c425edda-322b-4b12-bf02-48ceeeed9a59-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" 
(UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.975065 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.975085 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz64t\" (UniqueName: \"kubernetes.io/projected/8501154d-b03b-41f2-9b7d-c2d834718044-kube-api-access-bz64t\") pod \"service-ca-9c57cc56f-bvjpq\" (UID: \"8501154d-b03b-41f2-9b7d-c2d834718044\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.975104 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nztgq\" (UniqueName: \"kubernetes.io/projected/0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7-kube-api-access-nztgq\") pod \"kube-storage-version-migrator-operator-b67b599dd-j2jb2\" (UID: \"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.975121 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f95p4\" (UniqueName: \"kubernetes.io/projected/2d5dd7f9-7abf-4c5e-ad19-96e010400267-kube-api-access-f95p4\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dhk4\" (UID: \"2d5dd7f9-7abf-4c5e-ad19-96e010400267\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" Nov 26 12:13:41 crc 
kubenswrapper[4834]: I1126 12:13:41.975137 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976420 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pcv95\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976481 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5cfef378-bc85-4056-bccd-fbfec5535325-metrics-tls\") pod \"dns-operator-744455d44c-8j9fr\" (UID: \"5cfef378-bc85-4056-bccd-fbfec5535325\") " pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976507 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-stats-auth\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976529 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0d700278-027d-44e8-b8e3-c262fc0a6b44-images\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976546 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-client-ca\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976567 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976590 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8501154d-b03b-41f2-9b7d-c2d834718044-signing-key\") pod \"service-ca-9c57cc56f-bvjpq\" (UID: \"8501154d-b03b-41f2-9b7d-c2d834718044\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977108 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/280cee19-adbf-4307-ac10-337b76f6b6d1-config-volume\") pod \"collect-profiles-29402640-25hcz\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977220 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-service-ca\") pod 
\"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977267 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/12016219-99d2-4285-8112-a92858f97d1f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nt8mw\" (UID: \"12016219-99d2-4285-8112-a92858f97d1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977297 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977419 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-csi-data-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977444 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-bound-sa-token\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977461 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-j2jb2\" (UID: \"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977481 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60f10e1e-0000-46ee-9c26-d79f680a79df-audit-dir\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977511 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9a6cf357-27ce-4d2e-910a-b7223815b902-srv-cert\") pod \"catalog-operator-68c6474976-zbj9r\" (UID: \"9a6cf357-27ce-4d2e-910a-b7223815b902\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.984205 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-trusted-ca\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.984558 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-trusted-ca\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 
12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.984768 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-config\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.984881 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-config\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.984900 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d700278-027d-44e8-b8e3-c262fc0a6b44-config\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.985556 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/280cee19-adbf-4307-ac10-337b76f6b6d1-secret-volume\") pod \"collect-profiles-29402640-25hcz\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.986242 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.986392 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-serving-cert\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.986884 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.987780 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-tls\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.988232 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1fa120-0514-437c-9a02-2b3a0b9c9050-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-28tv5\" (UID: \"8f1fa120-0514-437c-9a02-2b3a0b9c9050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.975833 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-config\") pod \"apiserver-76f77b778f-hvfh2\" (UID: 
\"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976066 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f1fa120-0514-437c-9a02-2b3a0b9c9050-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-28tv5\" (UID: \"8f1fa120-0514-437c-9a02-2b3a0b9c9050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.977375 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-trusted-ca-bundle\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976205 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-audit\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.976329 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6470256-c870-4267-8e33-542f83d4f07c-config\") pod \"kube-apiserver-operator-766d6c64bb-mg6qr\" (UID: \"e6470256-c870-4267-8e33-542f83d4f07c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.989807 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-oauth-config\") pod \"console-f9d7485db-z2dpf\" (UID: 
\"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.990900 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.991406 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m"] Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.994496 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-default-certificate\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.995522 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/60f10e1e-0000-46ee-9c26-d79f680a79df-node-pullsecrets\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.996304 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262058ab-c269-4132-84e9-a244944bcc0d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtjlx\" (UID: \"262058ab-c269-4132-84e9-a244944bcc0d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.997140 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-etcd-service-ca\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.997253 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.997533 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-service-ca\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.997556 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-config\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.997757 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34fa680-13b0-492a-a80d-c2ef0709efb4-config\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:41 crc kubenswrapper[4834]: I1126 12:13:41.998383 
4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.000821 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-client-ca\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.001331 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60f10e1e-0000-46ee-9c26-d79f680a79df-audit-dir\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.001446 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.001663 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:42.501644194 +0000 UTC m=+120.408857546 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.001677 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.001777 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/60f10e1e-0000-46ee-9c26-d79f680a79df-etcd-client\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.001770 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.002462 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-image-import-ca\") pod 
\"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.003272 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.004459 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.004578 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pcv95\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.006397 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-j2jb2\" (UID: \"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.006963 4834 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-metrics-certs\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.007412 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6470256-c870-4267-8e33-542f83d4f07c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mg6qr\" (UID: \"e6470256-c870-4267-8e33-542f83d4f07c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.007717 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/059853c5-abf6-459b-9719-cc6688fd6e0a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tlgd5\" (UID: \"059853c5-abf6-459b-9719-cc6688fd6e0a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.008398 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-service-ca-bundle\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.008431 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2fa5531-03be-4051-968e-a3b00820266e-audit-dir\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.008436 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b34fa680-13b0-492a-a80d-c2ef0709efb4-serving-cert\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.008723 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0d700278-027d-44e8-b8e3-c262fc0a6b44-images\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.008846 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a53f903b-5b01-467d-be5c-28aabf3dc48b-config\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.009044 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-audit-policies\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.009401 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-client-ca\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 
12:13:42.009505 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c425edda-322b-4b12-bf02-48ceeeed9a59-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.009590 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-oauth-serving-cert\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.010487 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-j2jb2\" (UID: \"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.011095 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a1ea3730-6a71-4f4c-8588-7e39021bdb3d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dhx2d\" (UID: \"a1ea3730-6a71-4f4c-8588-7e39021bdb3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.011298 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-stats-auth\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " 
pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.011406 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-config\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.012277 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-service-ca-bundle\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.013548 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.013626 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-metrics-tls\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.014000 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a53f903b-5b01-467d-be5c-28aabf3dc48b-auth-proxy-config\") pod 
\"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.014033 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5dda160-6a44-4f03-b9a4-9baeeae03a54-serving-cert\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.014700 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.014765 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b34fa680-13b0-492a-a80d-c2ef0709efb4-trusted-ca\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.014928 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c425edda-322b-4b12-bf02-48ceeeed9a59-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.015274 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-certificates\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.015850 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/60f10e1e-0000-46ee-9c26-d79f680a79df-etcd-serving-ca\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.017335 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f10e1e-0000-46ee-9c26-d79f680a79df-serving-cert\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.017430 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d700278-027d-44e8-b8e3-c262fc0a6b44-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.017625 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/60f10e1e-0000-46ee-9c26-d79f680a79df-encryption-config\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.017656 4834 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pcv95\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.017765 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-serving-cert\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.017934 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/262058ab-c269-4132-84e9-a244944bcc0d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtjlx\" (UID: \"262058ab-c269-4132-84e9-a244944bcc0d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.018239 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a53f903b-5b01-467d-be5c-28aabf3dc48b-machine-approver-tls\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.018238 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.018972 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d5dd7f9-7abf-4c5e-ad19-96e010400267-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dhk4\" (UID: \"2d5dd7f9-7abf-4c5e-ad19-96e010400267\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.019347 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qls4f\" (UniqueName: \"kubernetes.io/projected/34801d76-ed04-44db-a572-a7a43067159d-kube-api-access-qls4f\") pod \"downloads-7954f5f757-9rfmg\" (UID: \"34801d76-ed04-44db-a572-a7a43067159d\") " pod="openshift-console/downloads-7954f5f757-9rfmg" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.019364 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5cfef378-bc85-4056-bccd-fbfec5535325-metrics-tls\") pod \"dns-operator-744455d44c-8j9fr\" (UID: \"5cfef378-bc85-4056-bccd-fbfec5535325\") " pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.019417 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-serving-cert\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.018492 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.020745 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.021819 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-etcd-client\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.022645 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1ea3730-6a71-4f4c-8588-7e39021bdb3d-proxy-tls\") pod \"machine-config-controller-84d6567774-dhx2d\" (UID: \"a1ea3730-6a71-4f4c-8588-7e39021bdb3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.023727 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-serving-cert\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.026409 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.033887 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tm6g\" (UniqueName: \"kubernetes.io/projected/17225c8b-a6dd-4958-b1be-58b3cc4ad317-kube-api-access-7tm6g\") pod \"console-f9d7485db-z2dpf\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") " pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.036433 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-625th"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.051768 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f66f2\" (UniqueName: \"kubernetes.io/projected/c2fa5531-03be-4051-968e-a3b00820266e-kube-api-access-f66f2\") pod \"oauth-openshift-558db77b4-rt9fj\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: W1126 12:13:42.053834 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4585ceb_bed5_4ab6_917e_15778b7a0f4d.slice/crio-c523e192963774ff3b1bff6fdd37ef438091d2d4af7bfdf3993cf7c37a4f5904 WatchSource:0}: Error finding container c523e192963774ff3b1bff6fdd37ef438091d2d4af7bfdf3993cf7c37a4f5904: Status 404 returned error can't find the container with id c523e192963774ff3b1bff6fdd37ef438091d2d4af7bfdf3993cf7c37a4f5904 Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.064132 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.069952 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkr44\" (UniqueName: \"kubernetes.io/projected/482e95e9-e2bc-4a48-8129-b5db9e7d26ac-kube-api-access-dkr44\") pod \"authentication-operator-69f744f599-w4l9d\" (UID: \"482e95e9-e2bc-4a48-8129-b5db9e7d26ac\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.080510 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.080714 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.080846 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:42.580827017 +0000 UTC m=+120.488040360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.080996 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-mountpoint-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081032 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzlm9\" (UniqueName: \"kubernetes.io/projected/f8ffe635-f56c-4c0b-915a-1c5aab084305-kube-api-access-hzlm9\") pod \"ingress-canary-qf9rv\" (UID: \"f8ffe635-f56c-4c0b-915a-1c5aab084305\") " pod="openshift-ingress-canary/ingress-canary-qf9rv" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081089 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9632af41-e005-4c92-b162-8a7fb760ac72-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w5jw8\" (UID: \"9632af41-e005-4c92-b162-8a7fb760ac72\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081119 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9a6cf357-27ce-4d2e-910a-b7223815b902-profile-collector-cert\") pod \"catalog-operator-68c6474976-zbj9r\" (UID: 
\"9a6cf357-27ce-4d2e-910a-b7223815b902\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081136 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc5g6\" (UniqueName: \"kubernetes.io/projected/166b9d5c-74b2-4891-bbba-f4c2e81b683d-kube-api-access-vc5g6\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081169 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbgrg\" (UniqueName: \"kubernetes.io/projected/fb88bb19-bdc2-44e9-9c43-3dc00e419563-kube-api-access-vbgrg\") pod \"machine-config-server-n7k2p\" (UID: \"fb88bb19-bdc2-44e9-9c43-3dc00e419563\") " pod="openshift-machine-config-operator/machine-config-server-n7k2p" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081208 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081232 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdbbc\" (UniqueName: \"kubernetes.io/projected/a25744d0-f7b9-4349-a07c-9655dd2a84c2-kube-api-access-mdbbc\") pod \"dns-default-ztdb4\" (UID: \"a25744d0-f7b9-4349-a07c-9655dd2a84c2\") " pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081248 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-socket-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081263 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n9xx\" (UniqueName: \"kubernetes.io/projected/9632af41-e005-4c92-b162-8a7fb760ac72-kube-api-access-8n9xx\") pod \"olm-operator-6b444d44fb-w5jw8\" (UID: \"9632af41-e005-4c92-b162-8a7fb760ac72\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081289 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fb88bb19-bdc2-44e9-9c43-3dc00e419563-node-bootstrap-token\") pod \"machine-config-server-n7k2p\" (UID: \"fb88bb19-bdc2-44e9-9c43-3dc00e419563\") " pod="openshift-machine-config-operator/machine-config-server-n7k2p" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081304 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-mountpoint-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081346 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-plugins-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081373 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/fb88bb19-bdc2-44e9-9c43-3dc00e419563-certs\") pod \"machine-config-server-n7k2p\" (UID: \"fb88bb19-bdc2-44e9-9c43-3dc00e419563\") " pod="openshift-machine-config-operator/machine-config-server-n7k2p" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081408 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8ffe635-f56c-4c0b-915a-1c5aab084305-cert\") pod \"ingress-canary-qf9rv\" (UID: \"f8ffe635-f56c-4c0b-915a-1c5aab084305\") " pod="openshift-ingress-canary/ingress-canary-qf9rv" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081431 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn94q\" (UniqueName: \"kubernetes.io/projected/9a6cf357-27ce-4d2e-910a-b7223815b902-kube-api-access-bn94q\") pod \"catalog-operator-68c6474976-zbj9r\" (UID: \"9a6cf357-27ce-4d2e-910a-b7223815b902\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081454 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a25744d0-f7b9-4349-a07c-9655dd2a84c2-metrics-tls\") pod \"dns-default-ztdb4\" (UID: \"a25744d0-f7b9-4349-a07c-9655dd2a84c2\") " pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081489 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mrxf\" (UniqueName: \"kubernetes.io/projected/12016219-99d2-4285-8112-a92858f97d1f-kube-api-access-7mrxf\") pod \"package-server-manager-789f6589d5-nt8mw\" (UID: \"12016219-99d2-4285-8112-a92858f97d1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081524 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-registration-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081541 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8501154d-b03b-41f2-9b7d-c2d834718044-signing-cabundle\") pod \"service-ca-9c57cc56f-bvjpq\" (UID: \"8501154d-b03b-41f2-9b7d-c2d834718044\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081558 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9632af41-e005-4c92-b162-8a7fb760ac72-srv-cert\") pod \"olm-operator-6b444d44fb-w5jw8\" (UID: \"9632af41-e005-4c92-b162-8a7fb760ac72\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081587 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz64t\" (UniqueName: \"kubernetes.io/projected/8501154d-b03b-41f2-9b7d-c2d834718044-kube-api-access-bz64t\") pod \"service-ca-9c57cc56f-bvjpq\" (UID: \"8501154d-b03b-41f2-9b7d-c2d834718044\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081617 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8501154d-b03b-41f2-9b7d-c2d834718044-signing-key\") pod \"service-ca-9c57cc56f-bvjpq\" (UID: \"8501154d-b03b-41f2-9b7d-c2d834718044\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081634 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/12016219-99d2-4285-8112-a92858f97d1f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nt8mw\" (UID: \"12016219-99d2-4285-8112-a92858f97d1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081651 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-csi-data-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081678 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9a6cf357-27ce-4d2e-910a-b7223815b902-srv-cert\") pod \"catalog-operator-68c6474976-zbj9r\" (UID: \"9a6cf357-27ce-4d2e-910a-b7223815b902\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.081691 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a25744d0-f7b9-4349-a07c-9655dd2a84c2-config-volume\") pod \"dns-default-ztdb4\" (UID: \"a25744d0-f7b9-4349-a07c-9655dd2a84c2\") " pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.082890 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-registration-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.083542 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8501154d-b03b-41f2-9b7d-c2d834718044-signing-cabundle\") pod \"service-ca-9c57cc56f-bvjpq\" (UID: \"8501154d-b03b-41f2-9b7d-c2d834718044\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.084048 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a25744d0-f7b9-4349-a07c-9655dd2a84c2-config-volume\") pod \"dns-default-ztdb4\" (UID: \"a25744d0-f7b9-4349-a07c-9655dd2a84c2\") " pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.084124 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-csi-data-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.087720 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-plugins-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.087726 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.087757 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/166b9d5c-74b2-4891-bbba-f4c2e81b683d-socket-dir\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.087812 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:42.587798108 +0000 UTC m=+120.495011459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.088243 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8501154d-b03b-41f2-9b7d-c2d834718044-signing-key\") pod \"service-ca-9c57cc56f-bvjpq\" (UID: \"8501154d-b03b-41f2-9b7d-c2d834718044\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.088406 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9632af41-e005-4c92-b162-8a7fb760ac72-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w5jw8\" (UID: 
\"9632af41-e005-4c92-b162-8a7fb760ac72\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.089749 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9632af41-e005-4c92-b162-8a7fb760ac72-srv-cert\") pod \"olm-operator-6b444d44fb-w5jw8\" (UID: \"9632af41-e005-4c92-b162-8a7fb760ac72\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.090132 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a25744d0-f7b9-4349-a07c-9655dd2a84c2-metrics-tls\") pod \"dns-default-ztdb4\" (UID: \"a25744d0-f7b9-4349-a07c-9655dd2a84c2\") " pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.090293 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9a6cf357-27ce-4d2e-910a-b7223815b902-profile-collector-cert\") pod \"catalog-operator-68c6474976-zbj9r\" (UID: \"9a6cf357-27ce-4d2e-910a-b7223815b902\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.090419 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fb88bb19-bdc2-44e9-9c43-3dc00e419563-certs\") pod \"machine-config-server-n7k2p\" (UID: \"fb88bb19-bdc2-44e9-9c43-3dc00e419563\") " pod="openshift-machine-config-operator/machine-config-server-n7k2p" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.091063 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9a6cf357-27ce-4d2e-910a-b7223815b902-srv-cert\") pod \"catalog-operator-68c6474976-zbj9r\" (UID: 
\"9a6cf357-27ce-4d2e-910a-b7223815b902\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.091865 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/12016219-99d2-4285-8112-a92858f97d1f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nt8mw\" (UID: \"12016219-99d2-4285-8112-a92858f97d1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.092263 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8ffe635-f56c-4c0b-915a-1c5aab084305-cert\") pod \"ingress-canary-qf9rv\" (UID: \"f8ffe635-f56c-4c0b-915a-1c5aab084305\") " pod="openshift-ingress-canary/ingress-canary-qf9rv" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.093212 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rds48\" (UniqueName: \"kubernetes.io/projected/6c485dff-89c6-4d40-8ba4-3b69ac68820e-kube-api-access-rds48\") pod \"marketplace-operator-79b997595-pcv95\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.095297 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fb88bb19-bdc2-44e9-9c43-3dc00e419563-node-bootstrap-token\") pod \"machine-config-server-n7k2p\" (UID: \"fb88bb19-bdc2-44e9-9c43-3dc00e419563\") " pod="openshift-machine-config-operator/machine-config-server-n7k2p" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.111900 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz"] Nov 26 12:13:42 crc 
kubenswrapper[4834]: I1126 12:13:42.114157 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48hv5\" (UniqueName: \"kubernetes.io/projected/c425edda-322b-4b12-bf02-48ceeeed9a59-kube-api-access-48hv5\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.130066 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9scc\" (UniqueName: \"kubernetes.io/projected/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-kube-api-access-b9scc\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.130948 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9rfmg" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.172048 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-bound-sa-token\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.182722 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.183471 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:42.683455442 +0000 UTC m=+120.590668794 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.183872 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.191466 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8njb\" (UniqueName: \"kubernetes.io/projected/60f10e1e-0000-46ee-9c26-d79f680a79df-kube-api-access-c8njb\") pod \"apiserver-76f77b778f-hvfh2\" (UID: \"60f10e1e-0000-46ee-9c26-d79f680a79df\") " pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.210917 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vcj6\" (UniqueName: \"kubernetes.io/projected/280cee19-adbf-4307-ac10-337b76f6b6d1-kube-api-access-4vcj6\") pod \"collect-profiles-29402640-25hcz\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.233406 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gs6v\" (UniqueName: 
\"kubernetes.io/projected/a1ea3730-6a71-4f4c-8588-7e39021bdb3d-kube-api-access-7gs6v\") pod \"machine-config-controller-84d6567774-dhx2d\" (UID: \"a1ea3730-6a71-4f4c-8588-7e39021bdb3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.238975 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.240532 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.256480 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f1fa120-0514-437c-9a02-2b3a0b9c9050-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-28tv5\" (UID: \"8f1fa120-0514-437c-9a02-2b3a0b9c9050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.258438 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.258649 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rt9fj"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.258781 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.269532 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.273463 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.279801 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8rhm\" (UniqueName: \"kubernetes.io/projected/a53f903b-5b01-467d-be5c-28aabf3dc48b-kube-api-access-g8rhm\") pod \"machine-approver-56656f9798-5flwm\" (UID: \"a53f903b-5b01-467d-be5c-28aabf3dc48b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.292081 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.292390 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:42.792377956 +0000 UTC m=+120.699591308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.298427 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r97rn\" (UniqueName: \"kubernetes.io/projected/262058ab-c269-4132-84e9-a244944bcc0d-kube-api-access-r97rn\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtjlx\" (UID: \"262058ab-c269-4132-84e9-a244944bcc0d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.322900 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6mz2\" (UniqueName: \"kubernetes.io/projected/0d700278-027d-44e8-b8e3-c262fc0a6b44-kube-api-access-p6mz2\") pod \"machine-api-operator-5694c8668f-jztgs\" (UID: \"0d700278-027d-44e8-b8e3-c262fc0a6b44\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.324096 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w4l9d"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.336098 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw47q\" (UniqueName: \"kubernetes.io/projected/4e332248-0575-4104-a6af-888110da84c0-kube-api-access-hw47q\") pod \"migrator-59844c95c7-bq2xl\" (UID: \"4e332248-0575-4104-a6af-888110da84c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl" Nov 26 12:13:42 crc 
kubenswrapper[4834]: I1126 12:13:42.358571 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nztgq\" (UniqueName: \"kubernetes.io/projected/0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7-kube-api-access-nztgq\") pod \"kube-storage-version-migrator-operator-b67b599dd-j2jb2\" (UID: \"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.370579 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9rfmg"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.383093 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f95p4\" (UniqueName: \"kubernetes.io/projected/2d5dd7f9-7abf-4c5e-ad19-96e010400267-kube-api-access-f95p4\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dhk4\" (UID: \"2d5dd7f9-7abf-4c5e-ad19-96e010400267\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.393161 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.393759 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:42.893743042 +0000 UTC m=+120.800956394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.438934 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f23f3e03-50f8-4160-b1f9-6f92ce90c73c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8xlp7\" (UID: \"f23f3e03-50f8-4160-b1f9-6f92ce90c73c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.475047 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-z2dpf"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.495098 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.495604 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:42.995590028 +0000 UTC m=+120.902803380 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.514584 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c425edda-322b-4b12-bf02-48ceeeed9a59-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bkfhv\" (UID: \"c425edda-322b-4b12-bf02-48ceeeed9a59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.572837 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfcjf\" (UniqueName: \"kubernetes.io/projected/059853c5-abf6-459b-9719-cc6688fd6e0a-kube-api-access-hfcjf\") pod \"multus-admission-controller-857f4d67dd-tlgd5\" (UID: \"059853c5-abf6-459b-9719-cc6688fd6e0a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.576642 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcv95"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.592139 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jxsz\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-kube-api-access-7jxsz\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.597385 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.598035 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.098009678 +0000 UTC m=+121.005223040 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.614592 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d"] Nov 26 12:13:42 crc kubenswrapper[4834]: W1126 12:13:42.620540 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c485dff_89c6_4d40_8ba4_3b69ac68820e.slice/crio-03e8ad3e53abeae126470c6feb9580c98d7fde733b4cde7511cb6ca2afd18778 WatchSource:0}: Error finding container 03e8ad3e53abeae126470c6feb9580c98d7fde733b4cde7511cb6ca2afd18778: Status 404 returned error can't find the container with id 03e8ad3e53abeae126470c6feb9580c98d7fde733b4cde7511cb6ca2afd18778 Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.622387 4834 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.631822 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbgrg\" (UniqueName: \"kubernetes.io/projected/fb88bb19-bdc2-44e9-9c43-3dc00e419563-kube-api-access-vbgrg\") pod \"machine-config-server-n7k2p\" (UID: \"fb88bb19-bdc2-44e9-9c43-3dc00e419563\") " pod="openshift-machine-config-operator/machine-config-server-n7k2p" Nov 26 12:13:42 crc kubenswrapper[4834]: W1126 12:13:42.640444 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f1fa120_0514_437c_9a02_2b3a0b9c9050.slice/crio-92852bbe957b2fdb4bb2025667785e0c79b1f346384fa8d5f2ed25c961dc0f11 WatchSource:0}: Error finding container 92852bbe957b2fdb4bb2025667785e0c79b1f346384fa8d5f2ed25c961dc0f11: Status 404 returned error can't find the container with id 92852bbe957b2fdb4bb2025667785e0c79b1f346384fa8d5f2ed25c961dc0f11 Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.691435 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.697265 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mrxf\" (UniqueName: \"kubernetes.io/projected/12016219-99d2-4285-8112-a92858f97d1f-kube-api-access-7mrxf\") pod \"package-server-manager-789f6589d5-nt8mw\" (UID: \"12016219-99d2-4285-8112-a92858f97d1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.698641 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.699074 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.199061961 +0000 UTC m=+121.106275313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.710947 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn94q\" (UniqueName: \"kubernetes.io/projected/9a6cf357-27ce-4d2e-910a-b7223815b902-kube-api-access-bn94q\") pod \"catalog-operator-68c6474976-zbj9r\" (UID: \"9a6cf357-27ce-4d2e-910a-b7223815b902\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.727968 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hvfh2"] Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.729891 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdbbc\" (UniqueName: \"kubernetes.io/projected/a25744d0-f7b9-4349-a07c-9655dd2a84c2-kube-api-access-mdbbc\") pod \"dns-default-ztdb4\" (UID: 
\"a25744d0-f7b9-4349-a07c-9655dd2a84c2\") " pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:42 crc kubenswrapper[4834]: W1126 12:13:42.740852 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f10e1e_0000_46ee_9c26_d79f680a79df.slice/crio-eba6cfe0f00cb6c54dfb8a06c74a94a4cd63b22386d8d840a140cb387b6ae729 WatchSource:0}: Error finding container eba6cfe0f00cb6c54dfb8a06c74a94a4cd63b22386d8d840a140cb387b6ae729: Status 404 returned error can't find the container with id eba6cfe0f00cb6c54dfb8a06c74a94a4cd63b22386d8d840a140cb387b6ae729 Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.750442 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n9xx\" (UniqueName: \"kubernetes.io/projected/9632af41-e005-4c92-b162-8a7fb760ac72-kube-api-access-8n9xx\") pod \"olm-operator-6b444d44fb-w5jw8\" (UID: \"9632af41-e005-4c92-b162-8a7fb760ac72\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.778525 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.791464 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6470256-c870-4267-8e33-542f83d4f07c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mg6qr\" (UID: \"e6470256-c870-4267-8e33-542f83d4f07c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.799607 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.800369 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.300209835 +0000 UTC m=+121.207423186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.800532 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.803229 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.818675 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.837013 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.861561 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.878667 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.878758 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.882967 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.893361 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.899086 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.901408 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:42 crc kubenswrapper[4834]: E1126 12:13:42.901754 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.401741845 +0000 UTC m=+121.308955197 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.907534 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.922698 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.925616 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" event={"ID":"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1","Type":"ContainerStarted","Data":"55ab93e5c37ca46ed86937e12345823b701f428e2208394cce076e24f5decf82"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.925666 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" event={"ID":"ea236e0f-7859-40a9-b92d-a0a54ce9d6f1","Type":"ContainerStarted","Data":"8a2efb7a7dfd9c863d2cda6bcc755198cb45646b7a23b8e1c66a5838699e2c10"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.929530 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.933032 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.939294 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.939629 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.939973 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" event={"ID":"00b79e1c-4a38-43c4-81b0-647cbc8f3acf","Type":"ContainerStarted","Data":"4bb9484315e990a9d74609917b3855aa2aae98f69ece40d1690e4a4bb032a1f9"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.951322 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z2dpf" event={"ID":"17225c8b-a6dd-4958-b1be-58b3cc4ad317","Type":"ContainerStarted","Data":"073177f09678c962a9da978555323d2dd073f556577ea158fb4849d48bcfbbb5"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.951362 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z2dpf" event={"ID":"17225c8b-a6dd-4958-b1be-58b3cc4ad317","Type":"ContainerStarted","Data":"166ddd96c15e3cb627d255e096a48504057654aea0a71d0ec5238a4e0f786fbd"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.956609 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" event={"ID":"8f1fa120-0514-437c-9a02-2b3a0b9c9050","Type":"ContainerStarted","Data":"b293c15eecfa0076794d7e23df81e3923f0855684088c867f07dfec0d090ab5f"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.956643 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" event={"ID":"8f1fa120-0514-437c-9a02-2b3a0b9c9050","Type":"ContainerStarted","Data":"92852bbe957b2fdb4bb2025667785e0c79b1f346384fa8d5f2ed25c961dc0f11"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.958770 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" 
event={"ID":"6c485dff-89c6-4d40-8ba4-3b69ac68820e","Type":"ContainerStarted","Data":"5c3e3de614293cd0a4d2675ba6eeaba6170a045d2cdc48d410129553d7c79d37"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.958802 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" event={"ID":"6c485dff-89c6-4d40-8ba4-3b69ac68820e","Type":"ContainerStarted","Data":"03e8ad3e53abeae126470c6feb9580c98d7fde733b4cde7511cb6ca2afd18778"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.959390 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.967424 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.969986 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" event={"ID":"482e95e9-e2bc-4a48-8129-b5db9e7d26ac","Type":"ContainerStarted","Data":"05731e7ee9afb8532daf4db9d4d74c1af77acd5aff9848394b6e974631f0df1c"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.970041 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" event={"ID":"482e95e9-e2bc-4a48-8129-b5db9e7d26ac","Type":"ContainerStarted","Data":"78953425733e7f79216654504f53deafbcfbf1df3b1bf3c2cb9e3e81df7649a8"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.977809 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.980769 4834 generic.go:334] "Generic (PLEG): container finished" podID="c4585ceb-bed5-4ab6-917e-15778b7a0f4d" 
containerID="5ab39720ad2cb73e620534a91cac9d8b5613ac30a50ab77778e3b285bab46d2d" exitCode=0 Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.980964 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" event={"ID":"c4585ceb-bed5-4ab6-917e-15778b7a0f4d","Type":"ContainerDied","Data":"5ab39720ad2cb73e620534a91cac9d8b5613ac30a50ab77778e3b285bab46d2d"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.981047 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" event={"ID":"c4585ceb-bed5-4ab6-917e-15778b7a0f4d","Type":"ContainerStarted","Data":"c523e192963774ff3b1bff6fdd37ef438091d2d4af7bfdf3993cf7c37a4f5904"} Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.987184 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.988648 4834 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pcv95 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.988698 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" podUID="6c485dff-89c6-4d40-8ba4-3b69ac68820e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Nov 26 12:13:42 crc kubenswrapper[4834]: I1126 12:13:42.996940 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.011924 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.012026 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.512002752 +0000 UTC m=+121.419216104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.015814 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.018834 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.518817085 +0000 UTC m=+121.426030437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.022550 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.023500 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" event={"ID":"a1ea3730-6a71-4f4c-8588-7e39021bdb3d","Type":"ContainerStarted","Data":"8a03f5db10649da8996ec36b90c54989cf85819cf2abf13678b8ef989d0c093f"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.023564 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" event={"ID":"a1ea3730-6a71-4f4c-8588-7e39021bdb3d","Type":"ContainerStarted","Data":"4f7a7b307a6462b8515af4e328fbadc0f96cf4ac0345e9abd9912635d4b8b480"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.032575 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.037788 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.043364 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" event={"ID":"7893c5da-bc30-4072-a95b-01adc121b50b","Type":"ContainerStarted","Data":"ebe9d8fd886a2b2b49d17812bea7926fdb29c8bc29b63fd170d77cc3a8dd6748"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.049453 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" event={"ID":"60f10e1e-0000-46ee-9c26-d79f680a79df","Type":"ContainerStarted","Data":"eba6cfe0f00cb6c54dfb8a06c74a94a4cd63b22386d8d840a140cb387b6ae729"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.055262 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" event={"ID":"c2fa5531-03be-4051-968e-a3b00820266e","Type":"ContainerStarted","Data":"d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.055333 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" event={"ID":"c2fa5531-03be-4051-968e-a3b00820266e","Type":"ContainerStarted","Data":"1ad3342216c5db83b213cd72414a17694f4a43a7bec3765798c1560e8bef0845"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.056085 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.058222 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.061559 4834 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-rt9fj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.061624 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" podUID="c2fa5531-03be-4051-968e-a3b00820266e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.061858 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.062658 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx"] Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.063868 4834 generic.go:334] "Generic (PLEG): container finished" podID="243172e8-801d-435a-a091-c436cd119f1f" containerID="2bc196b9963b99eba4775e0a92ea0c3b24b6925af7b29423e6080eaedc4a6ca8" exitCode=0 Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.063929 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" event={"ID":"243172e8-801d-435a-a091-c436cd119f1f","Type":"ContainerDied","Data":"2bc196b9963b99eba4775e0a92ea0c3b24b6925af7b29423e6080eaedc4a6ca8"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.063949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" event={"ID":"243172e8-801d-435a-a091-c436cd119f1f","Type":"ContainerStarted","Data":"819eeb3998a9bf468c7962b7a7444dca19aab39de66f3207eeca978948f61668"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.068098 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" event={"ID":"bcfca33b-1c7d-46ff-8f91-374eddf7b5b3","Type":"ContainerStarted","Data":"06167bb93b1494515721072bb5b93f24217daaa26c10093c1e16513620b576e5"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.068135 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" event={"ID":"bcfca33b-1c7d-46ff-8f91-374eddf7b5b3","Type":"ContainerStarted","Data":"300fa9c75ed29ee99a17776c7164252f0072c1420adea69e05dec81225543585"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.068146 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" event={"ID":"bcfca33b-1c7d-46ff-8f91-374eddf7b5b3","Type":"ContainerStarted","Data":"a692662c5ff64f57fc09e031cf5b7c2a79c827817b5bcd25cfac0d46a156947e"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.076818 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.082017 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9rfmg" event={"ID":"34801d76-ed04-44db-a572-a7a43067159d","Type":"ContainerStarted","Data":"11569582ae0aec4b711c2b691faf85c7e6c118572b8bc7f07d8dd5f7abec306b"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.082045 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9rfmg" 
event={"ID":"34801d76-ed04-44db-a572-a7a43067159d","Type":"ContainerStarted","Data":"ab68f1cf8b7f29e040f3585e61760d9b158708ea692f85ebe04ba2877efeab84"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.082220 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9rfmg" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.092397 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" event={"ID":"280cee19-adbf-4307-ac10-337b76f6b6d1","Type":"ContainerStarted","Data":"da409ec7e6526213d87b86559df271455aada2847dec663f781f0a498388c1f6"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.092439 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" event={"ID":"280cee19-adbf-4307-ac10-337b76f6b6d1","Type":"ContainerStarted","Data":"e39618af9cba4923b5a3b7225a5748459a1f2c7f3052f9ea91cd4f9b80216a12"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.098825 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.101611 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" event={"ID":"d1d6e1fe-3809-4973-bac7-6bdd60d74d74","Type":"ContainerStarted","Data":"1b88e7fa6914feef5a95a6d037ae0d7599b16a101eb6af71954c669682da18ad"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.104640 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" event={"ID":"5333eaad-5273-4b67-9876-a471945dbb76","Type":"ContainerStarted","Data":"97829287e1cbf50af7c73f5f803d6e39db90c5670673321bb062cd59f0b67173"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.104675 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" event={"ID":"5333eaad-5273-4b67-9876-a471945dbb76","Type":"ContainerStarted","Data":"60460f9fb1c8e28909569ff74f20c5db6bbf1a902ec3ce06a9956edf1d5f109c"} Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.119324 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.119995 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.120656 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.620637883 +0000 UTC m=+121.527851234 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.130485 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.131952 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-9rfmg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.132000 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9rfmg" podUID="34801d76-ed04-44db-a572-a7a43067159d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Nov 26 12:13:43 crc kubenswrapper[4834]: W1126 12:13:43.154777 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod262058ab_c269_4132_84e9_a244944bcc0d.slice/crio-003fa3f3dc1d2635c81da0dbfe3f70cf0dd45ea1bbf2f33cb56ad3f4660a9665 WatchSource:0}: Error finding container 003fa3f3dc1d2635c81da0dbfe3f70cf0dd45ea1bbf2f33cb56ad3f4660a9665: Status 404 returned error can't find the container with id 003fa3f3dc1d2635c81da0dbfe3f70cf0dd45ea1bbf2f33cb56ad3f4660a9665 Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.158188 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.168601 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.177802 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.190158 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb2n2\" (UniqueName: \"kubernetes.io/projected/ff730815-b4f6-48fb-b5e2-e9935e8b4bf8-kube-api-access-mb2n2\") pod \"etcd-operator-b45778765-stt44\" (UID: \"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.197799 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.206257 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.224251 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.224485 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.227876 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 12:13:43.727859559 +0000 UTC m=+121.635072900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.238142 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw"] Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.240706 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.244755 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nx9t\" (UniqueName: \"kubernetes.io/projected/4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9-kube-api-access-4nx9t\") pod \"router-default-5444994796-r462f\" (UID: \"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9\") " pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.253435 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6t7\" (UniqueName: \"kubernetes.io/projected/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-kube-api-access-kk6t7\") pod \"controller-manager-879f6c89f-vctqw\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.263541 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 26 12:13:43 
crc kubenswrapper[4834]: I1126 12:13:43.287297 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.287500 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n7k2p" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.288644 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsm7g\" (UniqueName: \"kubernetes.io/projected/5cfef378-bc85-4056-bccd-fbfec5535325-kube-api-access-xsm7g\") pod \"dns-operator-744455d44c-8j9fr\" (UID: \"5cfef378-bc85-4056-bccd-fbfec5535325\") " pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.299533 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.308423 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.317102 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.326142 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.326369 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8"] Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.326562 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.826544152 +0000 UTC m=+121.733757504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.326912 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.327293 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.827285645 +0000 UTC m=+121.734498996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.340667 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.346375 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qxv\" (UniqueName: \"kubernetes.io/projected/a5dda160-6a44-4f03-b9a4-9baeeae03a54-kube-api-access-56qxv\") pod \"route-controller-manager-6576b87f9c-r28bw\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.354901 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns569\" (UniqueName: \"kubernetes.io/projected/b34fa680-13b0-492a-a80d-c2ef0709efb4-kube-api-access-ns569\") pod \"console-operator-58897d9998-tdqww\" (UID: \"b34fa680-13b0-492a-a80d-c2ef0709efb4\") " pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.357771 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.364832 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzlm9\" (UniqueName: \"kubernetes.io/projected/f8ffe635-f56c-4c0b-915a-1c5aab084305-kube-api-access-hzlm9\") pod \"ingress-canary-qf9rv\" 
(UID: \"f8ffe635-f56c-4c0b-915a-1c5aab084305\") " pod="openshift-ingress-canary/ingress-canary-qf9rv" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.379385 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.394066 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc5g6\" (UniqueName: \"kubernetes.io/projected/166b9d5c-74b2-4891-bbba-f4c2e81b683d-kube-api-access-vc5g6\") pod \"csi-hostpathplugin-pss4j\" (UID: \"166b9d5c-74b2-4891-bbba-f4c2e81b683d\") " pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.400880 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.415575 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.428006 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.428437 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:43.92841872 +0000 UTC m=+121.835632072 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.436144 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz64t\" (UniqueName: \"kubernetes.io/projected/8501154d-b03b-41f2-9b7d-c2d834718044-kube-api-access-bz64t\") pod \"service-ca-9c57cc56f-bvjpq\" (UID: \"8501154d-b03b-41f2-9b7d-c2d834718044\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.437400 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.440109 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.456753 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.462577 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.497481 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.501893 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.519230 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.529385 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.530969 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.531705 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.031684522 +0000 UTC m=+121.938897874 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.550483 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r"] Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.558050 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.558209 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.582360 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.589560 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.601855 4834 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.608529 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pss4j" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.620591 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2"] Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.621060 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.628110 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.633672 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.633883 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.133842127 +0000 UTC m=+122.041055478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.634075 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.634434 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.134425811 +0000 UTC m=+122.041639163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.635833 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl"] Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.642958 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.643152 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qf9rv" Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.739905 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.741331 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.241299428 +0000 UTC m=+122.148512780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.829036 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4"] Nov 26 12:13:43 crc kubenswrapper[4834]: W1126 12:13:43.829535 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad04846_9bd9_4d2a_bfb1_bf97c817e1f7.slice/crio-79b16b0121dd8a48be1d975bb7c9b42018ee6a84f003746477d06df545ad119c WatchSource:0}: Error finding container 79b16b0121dd8a48be1d975bb7c9b42018ee6a84f003746477d06df545ad119c: Status 404 returned error can't find the container with id 79b16b0121dd8a48be1d975bb7c9b42018ee6a84f003746477d06df545ad119c Nov 26 12:13:43 crc kubenswrapper[4834]: W1126 12:13:43.841946 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e332248_0575_4104_a6af_888110da84c0.slice/crio-38f42c58cdfe74d4c2b9d7c19743d561d6c3325f01ccc3f3fa2311b538f28718 WatchSource:0}: Error finding container 38f42c58cdfe74d4c2b9d7c19743d561d6c3325f01ccc3f3fa2311b538f28718: Status 404 returned error can't find the container with id 38f42c58cdfe74d4c2b9d7c19743d561d6c3325f01ccc3f3fa2311b538f28718 Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.842559 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.843828 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.343814287 +0000 UTC m=+122.251027640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.861420 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7"] Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.943409 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:43 crc kubenswrapper[4834]: E1126 12:13:43.943775 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 12:13:44.443760838 +0000 UTC m=+122.350974190 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:43 crc kubenswrapper[4834]: I1126 12:13:43.989714 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jztgs"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.002123 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ztdb4"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.049770 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.050046 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.550035383 +0000 UTC m=+122.457248735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.083189 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.129731 4834 generic.go:334] "Generic (PLEG): container finished" podID="60f10e1e-0000-46ee-9c26-d79f680a79df" containerID="1c0dbe0650771ad1ca265cb7a8f99eade40501cb59c48675f21f49e048bea406" exitCode=0 Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.130333 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" event={"ID":"60f10e1e-0000-46ee-9c26-d79f680a79df","Type":"ContainerDied","Data":"1c0dbe0650771ad1ca265cb7a8f99eade40501cb59c48675f21f49e048bea406"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.132043 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" event={"ID":"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7","Type":"ContainerStarted","Data":"79b16b0121dd8a48be1d975bb7c9b42018ee6a84f003746477d06df545ad119c"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.133155 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" event={"ID":"12016219-99d2-4285-8112-a92858f97d1f","Type":"ContainerStarted","Data":"5680ed7a57cbd10a8bc98bc2171ec239c60d1f05766d06d434101a44c098e358"} Nov 26 12:13:44 crc 
kubenswrapper[4834]: I1126 12:13:44.133175 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" event={"ID":"12016219-99d2-4285-8112-a92858f97d1f","Type":"ContainerStarted","Data":"96499467ed0c3b4fee5bf2a7225b3d7b7599d5528ef7f4f4ce2478a290f3ec8d"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.136995 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl" event={"ID":"4e332248-0575-4104-a6af-888110da84c0","Type":"ContainerStarted","Data":"38f42c58cdfe74d4c2b9d7c19743d561d6c3325f01ccc3f3fa2311b538f28718"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.142387 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" event={"ID":"243172e8-801d-435a-a091-c436cd119f1f","Type":"ContainerStarted","Data":"032b57baadda6c30e2f3316cba9715ec191946e712d0f1868d6f5528b2feab84"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.142832 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.150512 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.150836 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 12:13:44.650822884 +0000 UTC m=+122.558036237 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.167728 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" event={"ID":"9a6cf357-27ce-4d2e-910a-b7223815b902","Type":"ContainerStarted","Data":"0bb895649f8d7213a0f6234a8ccc2a9e6b47c1790949bc94e9d4135e01cb6aab"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.182288 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" event={"ID":"262058ab-c269-4132-84e9-a244944bcc0d","Type":"ContainerStarted","Data":"19e0ea11a4f02f7558a28d815bf8056c6606061921c09165b5c138986ef64902"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.182373 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" event={"ID":"262058ab-c269-4132-84e9-a244944bcc0d","Type":"ContainerStarted","Data":"003fa3f3dc1d2635c81da0dbfe3f70cf0dd45ea1bbf2f33cb56ad3f4660a9665"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.191058 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" event={"ID":"9632af41-e005-4c92-b162-8a7fb760ac72","Type":"ContainerStarted","Data":"938b0326183ade76d58815c3d5fbfb0a23440690dba6734c3e63e3b358c27622"} Nov 26 
12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.191102 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" event={"ID":"9632af41-e005-4c92-b162-8a7fb760ac72","Type":"ContainerStarted","Data":"1191f615fdbc3db83102bd23c68aad0db96bccadf9416c42ee33e5d4ade43a6f"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.192222 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.198823 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r462f" event={"ID":"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9","Type":"ContainerStarted","Data":"5049c6183096ad47c057f8cfed3e3b0d6ee4f84d74ca3292962f25ad65bdb35c"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.203461 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" event={"ID":"f23f3e03-50f8-4160-b1f9-6f92ce90c73c","Type":"ContainerStarted","Data":"1ffdab8485e86b70dbc9af2f0e99359df77c467cd9bb0b5a68df390d22fa765e"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.208393 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" event={"ID":"0d700278-027d-44e8-b8e3-c262fc0a6b44","Type":"ContainerStarted","Data":"89ec1c0ee07636709059a28371444ca859d261a5eb8627f4f91d3479b6c2305c"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.225239 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" event={"ID":"c4585ceb-bed5-4ab6-917e-15778b7a0f4d","Type":"ContainerStarted","Data":"b0277fd8474da3b571fb4516a213828f0e9b2ed79c7f77150caf6a107af0057a"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.230020 4834 patch_prober.go:28] interesting 
pod/olm-operator-6b444d44fb-w5jw8 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.230052 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" podUID="9632af41-e005-4c92-b162-8a7fb760ac72" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.243477 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" event={"ID":"a53f903b-5b01-467d-be5c-28aabf3dc48b","Type":"ContainerStarted","Data":"4de2e7ade573b82b5a1e733e5d93729f5cd625a74d4409e0b2010bc23d087bd6"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.243519 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" event={"ID":"a53f903b-5b01-467d-be5c-28aabf3dc48b","Type":"ContainerStarted","Data":"c6c4272727daa9083f419ef06a6ce70306cea6032cdb2c57c05efd5c3c6e0d21"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.250531 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" event={"ID":"2d5dd7f9-7abf-4c5e-ad19-96e010400267","Type":"ContainerStarted","Data":"bb44fde47c16a8f6ea639fb03a3ee00224be6e6453e6cd9a59f88e94372a342c"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.252813 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.255212 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.755189899 +0000 UTC m=+122.662403252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.280801 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" event={"ID":"a1ea3730-6a71-4f4c-8588-7e39021bdb3d","Type":"ContainerStarted","Data":"78bbb8c2abf637507e3c199000ae264de7f2dd091278a4e856e4d660366b9f2f"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.323790 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n7k2p" event={"ID":"fb88bb19-bdc2-44e9-9c43-3dc00e419563","Type":"ContainerStarted","Data":"48c8e6f13ec61b897a0043f6387a0ca1486ee3a02be7c21565b103f221f0ebf2"} Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.323871 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n7k2p" event={"ID":"fb88bb19-bdc2-44e9-9c43-3dc00e419563","Type":"ContainerStarted","Data":"8a37203553ceb41752d83628b5dbd613b4588ed8ced80f0ef51f2c9435785544"} Nov 
26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.324432 4834 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pcv95 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.324492 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" podUID="6c485dff-89c6-4d40-8ba4-3b69ac68820e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.325286 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-9rfmg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.325357 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9rfmg" podUID="34801d76-ed04-44db-a572-a7a43067159d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.338782 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.353964 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.354116 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.854101382 +0000 UTC m=+122.761314733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.354209 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.356625 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.856607193 +0000 UTC m=+122.763820545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.358590 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8j9fr"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.361456 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr"] Nov 26 12:13:44 crc kubenswrapper[4834]: W1126 12:13:44.403058 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6470256_c870_4267_8e33_542f83d4f07c.slice/crio-8d814b0a2c5291605df387515aa0b032ef8754f98e0d47d9116c4ae36afcae55 WatchSource:0}: Error finding container 8d814b0a2c5291605df387515aa0b032ef8754f98e0d47d9116c4ae36afcae55: Status 404 returned error can't find the container with id 8d814b0a2c5291605df387515aa0b032ef8754f98e0d47d9116c4ae36afcae55 Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.460517 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.462385 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:44.96236542 +0000 UTC m=+122.869578772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.547365 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stt44"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.569246 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.569784 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:45.069771204 +0000 UTC m=+122.976984557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.569785 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.569940 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.569967 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.670071 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.670518 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:45.170504574 +0000 UTC m=+123.077717925 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.697279 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vctqw"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.711118 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tdqww"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.744459 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tlgd5"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.771772 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.772163 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 12:13:45.272152514 +0000 UTC m=+123.179365866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: W1126 12:13:44.818297 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb34fa680_13b0_492a_a80d_c2ef0709efb4.slice/crio-888b962a5e6db0c4ce58293353f1e1eb974619e4e05df53ef63fb03a4112b734 WatchSource:0}: Error finding container 888b962a5e6db0c4ce58293353f1e1eb974619e4e05df53ef63fb03a4112b734: Status 404 returned error can't find the container with id 888b962a5e6db0c4ce58293353f1e1eb974619e4e05df53ef63fb03a4112b734 Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.824989 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.832302 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pss4j"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.859632 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bvjpq"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.875570 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.875971 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:45.375951925 +0000 UTC m=+123.283165277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.887040 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qf9rv"] Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.935988 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9w45j" podStartSLOduration=100.935965147 podStartE2EDuration="1m40.935965147s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:44.93455049 +0000 UTC m=+122.841763842" watchObservedRunningTime="2025-11-26 12:13:44.935965147 +0000 UTC m=+122.843178499" Nov 26 12:13:44 crc kubenswrapper[4834]: I1126 12:13:44.976666 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:44 crc kubenswrapper[4834]: E1126 12:13:44.977252 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:45.477233251 +0000 UTC m=+123.384446603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.018631 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-28tv5" podStartSLOduration=101.01861503 podStartE2EDuration="1m41.01861503s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.016700727 +0000 UTC m=+122.923914080" watchObservedRunningTime="2025-11-26 12:13:45.01861503 +0000 UTC m=+122.925828382" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.019563 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r462f" podStartSLOduration=101.019553456 podStartE2EDuration="1m41.019553456s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-11-26 12:13:44.972928175 +0000 UTC m=+122.880141517" watchObservedRunningTime="2025-11-26 12:13:45.019553456 +0000 UTC m=+122.926766808" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.078953 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:45 crc kubenswrapper[4834]: E1126 12:13:45.079369 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:45.57935626 +0000 UTC m=+123.486569613 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.133588 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" podStartSLOduration=101.133569999 podStartE2EDuration="1m41.133569999s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.130716329 +0000 UTC m=+123.037929681" watchObservedRunningTime="2025-11-26 12:13:45.133569999 +0000 UTC m=+123.040783351" Nov 
26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.184084 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:45 crc kubenswrapper[4834]: E1126 12:13:45.184434 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:45.684420424 +0000 UTC m=+123.591633776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.286305 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:45 crc kubenswrapper[4834]: E1126 12:13:45.286590 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 12:13:45.786576786 +0000 UTC m=+123.693790139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.331804 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tdqww" event={"ID":"b34fa680-13b0-492a-a80d-c2ef0709efb4","Type":"ContainerStarted","Data":"672140dbdcd4e9537fd88f016121b1b5b57ead6ac582e948de6ef530586d666c"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.331845 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tdqww" event={"ID":"b34fa680-13b0-492a-a80d-c2ef0709efb4","Type":"ContainerStarted","Data":"888b962a5e6db0c4ce58293353f1e1eb974619e4e05df53ef63fb03a4112b734"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.331862 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.333815 4834 patch_prober.go:28] interesting pod/console-operator-58897d9998-tdqww container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.333849 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tdqww" 
podUID="b34fa680-13b0-492a-a80d-c2ef0709efb4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.335951 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" podStartSLOduration=101.335922434 podStartE2EDuration="1m41.335922434s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.333629836 +0000 UTC m=+123.240843188" watchObservedRunningTime="2025-11-26 12:13:45.335922434 +0000 UTC m=+123.243135786" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.342602 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" event={"ID":"2d5dd7f9-7abf-4c5e-ad19-96e010400267","Type":"ContainerStarted","Data":"85bc0e94f4dd8ce9db5b8f6501268975108bdedbbc6182761cc22218d22c747b"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.343781 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" event={"ID":"a5dda160-6a44-4f03-b9a4-9baeeae03a54","Type":"ContainerStarted","Data":"abd6518868a0a3ff91ab132aba9cb598dffdb99f2ee95047ab166a0be07041fe"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.346489 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" event={"ID":"c425edda-322b-4b12-bf02-48ceeeed9a59","Type":"ContainerStarted","Data":"1cf6868248611314a07416282d8d0af6e766f6463626790852bfb6fefb7eeb72"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.346517 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" event={"ID":"c425edda-322b-4b12-bf02-48ceeeed9a59","Type":"ContainerStarted","Data":"567a4b0663ed98c1d516c48b2dec459050975ec820a8d586a1019d8c972bf25c"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.353176 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" event={"ID":"8501154d-b03b-41f2-9b7d-c2d834718044","Type":"ContainerStarted","Data":"9c771fbb1ec3518d8d72d72e1ee9505b684bee43e4f1fae8c6bb4c7e2ab96bbd"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.353246 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" event={"ID":"8501154d-b03b-41f2-9b7d-c2d834718044","Type":"ContainerStarted","Data":"ce9daf857e372447ddca61a1d8c5adcec32d246f68b1e6c09b1a94852887b3da"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.371114 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pss4j" event={"ID":"166b9d5c-74b2-4891-bbba-f4c2e81b683d","Type":"ContainerStarted","Data":"9d1b89930608502ef90ed4cc131aacf28df8e3fef4912c97874e2f03178fabc3"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.381186 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" event={"ID":"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8","Type":"ContainerStarted","Data":"ce9fa117e1b6e30dd83cf6a092d5860714e614eaee452442bc869b8c624bf835"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.381234 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" event={"ID":"ff730815-b4f6-48fb-b5e2-e9935e8b4bf8","Type":"ContainerStarted","Data":"eb5a91cb7183f12bdc4bf29b24d7076c88062a8dd4bc2440aefdcb306f438766"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.387123 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:45 crc kubenswrapper[4834]: E1126 12:13:45.387527 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:45.887512969 +0000 UTC m=+123.794726321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.393153 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" event={"ID":"9a6cf357-27ce-4d2e-910a-b7223815b902","Type":"ContainerStarted","Data":"154335f537d897b4720e3477665a9fc2110440bdbdc5b7851d216405376477f6"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.408077 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.427466 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.440474 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" event={"ID":"0ad04846-9bd9-4d2a-bfb1-bf97c817e1f7","Type":"ContainerStarted","Data":"1edc0e8b07809e8a3603226285c9c7b3b2d674bf8ffa8a51be000a3a80c9b76f"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.446419 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-z2dpf" podStartSLOduration=101.44640002 podStartE2EDuration="1m41.44640002s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.436668426 +0000 UTC m=+123.343881779" watchObservedRunningTime="2025-11-26 12:13:45.44640002 +0000 UTC m=+123.353613372" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.456781 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" event={"ID":"94e98907-4981-47a2-b4ae-fa83a1a6e9ac","Type":"ContainerStarted","Data":"0acb7e2723c49d198d919cac3d73caef8f19e1c375635dea13b0dbb0c365a575"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.456825 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" event={"ID":"94e98907-4981-47a2-b4ae-fa83a1a6e9ac","Type":"ContainerStarted","Data":"4188e325f7f1709c7308e8b5ce53c8c63d1b91517a8204e71d6be0d642a7c437"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.457417 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.461767 4834 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vctqw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.461811 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" podUID="94e98907-4981-47a2-b4ae-fa83a1a6e9ac" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.479853 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl" event={"ID":"4e332248-0575-4104-a6af-888110da84c0","Type":"ContainerStarted","Data":"63c3977717f0a36c0eab340c42401d4217bf209346279483a24a19494e80535d"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.479893 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl" event={"ID":"4e332248-0575-4104-a6af-888110da84c0","Type":"ContainerStarted","Data":"02c850e5f06444151a2d011e9b3604ce79eeb738566bcb7aa24ffb2b891cdfe3"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.490055 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.491620 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9rfmg" podStartSLOduration=101.491602599 podStartE2EDuration="1m41.491602599s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-26 12:13:45.456551297 +0000 UTC m=+123.363764649" watchObservedRunningTime="2025-11-26 12:13:45.491602599 +0000 UTC m=+123.398815951" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.491905 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" podStartSLOduration=101.491900713 podStartE2EDuration="1m41.491900713s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.489675873 +0000 UTC m=+123.396889225" watchObservedRunningTime="2025-11-26 12:13:45.491900713 +0000 UTC m=+123.399114065" Nov 26 12:13:45 crc kubenswrapper[4834]: E1126 12:13:45.496604 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:45.996582111 +0000 UTC m=+123.903795463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.514090 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" event={"ID":"60f10e1e-0000-46ee-9c26-d79f680a79df","Type":"ContainerStarted","Data":"ef47cb1c54b109cf94825d315354643d9151851b4f44af4c40281081c97d26ff"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.519703 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" event={"ID":"f23f3e03-50f8-4160-b1f9-6f92ce90c73c","Type":"ContainerStarted","Data":"8cf2742936a947b5ac7d4ac480dc0c184e810b5d77ff7e43cd62406370c26a15"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.520227 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" event={"ID":"f23f3e03-50f8-4160-b1f9-6f92ce90c73c","Type":"ContainerStarted","Data":"e3af7160ad83ec0790b903ed421ffc04a30e01af793352f14167f8aced253873"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.531753 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhx2d" podStartSLOduration=101.531682374 podStartE2EDuration="1m41.531682374s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.530589287 +0000 UTC m=+123.437802639" 
watchObservedRunningTime="2025-11-26 12:13:45.531682374 +0000 UTC m=+123.438895716" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.554338 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" event={"ID":"0d700278-027d-44e8-b8e3-c262fc0a6b44","Type":"ContainerStarted","Data":"42526c283ceb26ab14680ceb86bf7c3cbef19cfa82cfa01a247eb1580ddb3998"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.554372 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" event={"ID":"0d700278-027d-44e8-b8e3-c262fc0a6b44","Type":"ContainerStarted","Data":"5e20080f441c8dec53a83fe6290bf06ecf61f7ac775d72af0b4a80911f255359"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.564269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" event={"ID":"12016219-99d2-4285-8112-a92858f97d1f","Type":"ContainerStarted","Data":"7a3e6c9fcaf4e4011cc23113ea15663a97d49b253c19c7b6d72e9f58c1d31f43"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.564695 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.566091 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:45 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:45 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:45 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.566127 4834 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.580258 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k6bw9" podStartSLOduration=101.580248287 podStartE2EDuration="1m41.580248287s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.57929312 +0000 UTC m=+123.486506472" watchObservedRunningTime="2025-11-26 12:13:45.580248287 +0000 UTC m=+123.487461628" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.585500 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" event={"ID":"5cfef378-bc85-4056-bccd-fbfec5535325","Type":"ContainerStarted","Data":"ae9cf08f6a8ed8ef32fd1ee17e994eb935af37cf186eb00c509d5d57e1a07d5a"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.585545 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" event={"ID":"5cfef378-bc85-4056-bccd-fbfec5535325","Type":"ContainerStarted","Data":"bba8c1604819008b2615fe3c8d33b9273348899d556b8e5c383c3cc6800321ee"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.592469 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:45 crc kubenswrapper[4834]: E1126 12:13:45.594071 4834 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.09405959 +0000 UTC m=+124.001272942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.615719 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xhdnp" podStartSLOduration=101.615709033 podStartE2EDuration="1m41.615709033s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.613880955 +0000 UTC m=+123.521094296" watchObservedRunningTime="2025-11-26 12:13:45.615709033 +0000 UTC m=+123.522922385" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.628228 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r462f" event={"ID":"4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9","Type":"ContainerStarted","Data":"fc9315c8203f08647e5824077563ed9c0e577c86415381fd9402287fdb37cd81"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.640847 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qf9rv" event={"ID":"f8ffe635-f56c-4c0b-915a-1c5aab084305","Type":"ContainerStarted","Data":"aef40caa7d1da9b5c7e7a6e84f3070b842573a80eacb2ca0f192aeb1f924aae6"} 
Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.642251 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" event={"ID":"059853c5-abf6-459b-9719-cc6688fd6e0a","Type":"ContainerStarted","Data":"998da184763ac4da53ffdf4e90a109b7343094739ecd30bf8c142d46fabe1df2"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.643278 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" event={"ID":"a53f903b-5b01-467d-be5c-28aabf3dc48b","Type":"ContainerStarted","Data":"c4bf579e3ed2cb359e71feef09d95f6d39b2c53625a6140ab6ad9b34f3dc19a0"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.645396 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ztdb4" event={"ID":"a25744d0-f7b9-4349-a07c-9655dd2a84c2","Type":"ContainerStarted","Data":"f15116efb98d89347ddd2ec35b18fb0e7aef0308c12f0397b4c3fb7f4abe448a"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.645421 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ztdb4" event={"ID":"a25744d0-f7b9-4349-a07c-9655dd2a84c2","Type":"ContainerStarted","Data":"72c6e62705c40065d0513c348b80b5831d7b209da31bed48d94de930e4470e4e"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.645737 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.648285 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" event={"ID":"e6470256-c870-4267-8e33-542f83d4f07c","Type":"ContainerStarted","Data":"14bb4347de81d00ef3e113ecb9f120a1db61c7170ff81a94d2b5a366bdf527c5"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.648646 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" event={"ID":"e6470256-c870-4267-8e33-542f83d4f07c","Type":"ContainerStarted","Data":"8d814b0a2c5291605df387515aa0b032ef8754f98e0d47d9116c4ae36afcae55"} Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.660567 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w5jw8" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.665590 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.696871 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:45 crc kubenswrapper[4834]: E1126 12:13:45.698207 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.198192491 +0000 UTC m=+124.105405844 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.739039 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" podStartSLOduration=101.739021114 podStartE2EDuration="1m41.739021114s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.696334245 +0000 UTC m=+123.603547597" watchObservedRunningTime="2025-11-26 12:13:45.739021114 +0000 UTC m=+123.646234466" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.774838 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-w4l9d" podStartSLOduration=101.774816343 podStartE2EDuration="1m41.774816343s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.739721099 +0000 UTC m=+123.646934450" watchObservedRunningTime="2025-11-26 12:13:45.774816343 +0000 UTC m=+123.682029684" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.798424 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: 
\"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:45 crc kubenswrapper[4834]: E1126 12:13:45.798682 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.2986719 +0000 UTC m=+124.205885252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.805425 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hpfzc" podStartSLOduration=101.805403527 podStartE2EDuration="1m41.805403527s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.777381486 +0000 UTC m=+123.684594839" watchObservedRunningTime="2025-11-26 12:13:45.805403527 +0000 UTC m=+123.712616879" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.905017 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zl8n8" podStartSLOduration=101.904995899 podStartE2EDuration="1m41.904995899s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-26 12:13:45.858560487 +0000 UTC m=+123.765773839" watchObservedRunningTime="2025-11-26 12:13:45.904995899 +0000 UTC m=+123.812209250" Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.906859 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:45 crc kubenswrapper[4834]: E1126 12:13:45.907268 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.407254131 +0000 UTC m=+124.314467482 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:45 crc kubenswrapper[4834]: I1126 12:13:45.975081 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l74fz" podStartSLOduration=101.975054587 podStartE2EDuration="1m41.975054587s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:45.952650556 +0000 UTC m=+123.859863908" watchObservedRunningTime="2025-11-26 12:13:45.975054587 +0000 UTC 
m=+123.882267939" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.008938 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.009211 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.509199864 +0000 UTC m=+124.416413216 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.023262 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtjlx" podStartSLOduration=102.023240231 podStartE2EDuration="1m42.023240231s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.02023318 +0000 UTC m=+123.927446532" watchObservedRunningTime="2025-11-26 12:13:46.023240231 +0000 UTC m=+123.930453582" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.058221 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" podStartSLOduration=102.058195751 podStartE2EDuration="1m42.058195751s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.05672122 +0000 UTC m=+123.963934572" watchObservedRunningTime="2025-11-26 12:13:46.058195751 +0000 UTC m=+123.965409103" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.110564 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.110705 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.61067674 +0000 UTC m=+124.517890092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.110944 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.111294 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.611280343 +0000 UTC m=+124.518493695 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.133846 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-n7k2p" podStartSLOduration=7.133829618 podStartE2EDuration="7.133829618s" podCreationTimestamp="2025-11-26 12:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.132733484 +0000 UTC m=+124.039946836" watchObservedRunningTime="2025-11-26 12:13:46.133829618 +0000 UTC m=+124.041042961" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.134344 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" podStartSLOduration=102.134339673 podStartE2EDuration="1m42.134339673s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.090857199 +0000 UTC m=+123.998070551" watchObservedRunningTime="2025-11-26 12:13:46.134339673 +0000 UTC m=+124.041553026" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.166299 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg6qr" podStartSLOduration=102.1662913 podStartE2EDuration="1m42.1662913s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.165574373 +0000 UTC m=+124.072787725" watchObservedRunningTime="2025-11-26 12:13:46.1662913 +0000 UTC m=+124.073504651" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.212478 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.212619 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.712602756 +0000 UTC m=+124.619816108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.212904 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.213244 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.713230996 +0000 UTC m=+124.620444347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.240171 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-stt44" podStartSLOduration=102.240148396 podStartE2EDuration="1m42.240148396s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.238615065 +0000 UTC m=+124.145828416" watchObservedRunningTime="2025-11-26 12:13:46.240148396 +0000 UTC m=+124.147361748" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.292820 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xlp7" podStartSLOduration=102.29278457 podStartE2EDuration="1m42.29278457s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.265786958 +0000 UTC m=+124.173000309" watchObservedRunningTime="2025-11-26 12:13:46.29278457 +0000 UTC m=+124.199997922" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.314153 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.314263 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.814245757 +0000 UTC m=+124.721459109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.314494 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.314789 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.81478096 +0000 UTC m=+124.721994312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.372225 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" podStartSLOduration=102.372203831 podStartE2EDuration="1m42.372203831s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.346664768 +0000 UTC m=+124.253878120" watchObservedRunningTime="2025-11-26 12:13:46.372203831 +0000 UTC m=+124.279417182" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.399068 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ssd6t"] Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.400075 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.415386 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.415548 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.915524308 +0000 UTC m=+124.822737661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.415925 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.416369 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:46.916349349 +0000 UTC m=+124.823562702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.417055 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bq2xl" podStartSLOduration=102.417032351 podStartE2EDuration="1m42.417032351s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.413859037 +0000 UTC m=+124.321072389" watchObservedRunningTime="2025-11-26 12:13:46.417032351 +0000 UTC m=+124.324245703" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.423654 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.429413 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ssd6t"] Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.471694 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ztdb4" podStartSLOduration=7.471674852 podStartE2EDuration="7.471674852s" podCreationTimestamp="2025-11-26 12:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 
12:13:46.469494146 +0000 UTC m=+124.376707498" watchObservedRunningTime="2025-11-26 12:13:46.471674852 +0000 UTC m=+124.378888194" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.516866 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.517038 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.017005493 +0000 UTC m=+124.924218845 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.517212 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-utilities\") pod \"certified-operators-ssd6t\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.517273 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mr2h\" (UniqueName: 
\"kubernetes.io/projected/498090a1-7333-4c7f-b48b-ae3ea0166165-kube-api-access-8mr2h\") pod \"certified-operators-ssd6t\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.517360 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-catalog-content\") pod \"certified-operators-ssd6t\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.517395 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.517813 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.017794225 +0000 UTC m=+124.925007577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.559291 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dhk4" podStartSLOduration=102.559278448 podStartE2EDuration="1m42.559278448s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.558654317 +0000 UTC m=+124.465867670" watchObservedRunningTime="2025-11-26 12:13:46.559278448 +0000 UTC m=+124.466491800" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.560495 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-bvjpq" podStartSLOduration=102.560490151 podStartE2EDuration="1m42.560490151s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.507178198 +0000 UTC m=+124.414391550" watchObservedRunningTime="2025-11-26 12:13:46.560490151 +0000 UTC m=+124.467703503" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.562304 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:46 crc kubenswrapper[4834]: 
[-]has-synced failed: reason withheld Nov 26 12:13:46 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:46 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.562587 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.598351 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-tdqww" podStartSLOduration=102.598343393 podStartE2EDuration="1m42.598343393s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.593068031 +0000 UTC m=+124.500281384" watchObservedRunningTime="2025-11-26 12:13:46.598343393 +0000 UTC m=+124.505556745" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.606758 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qsb7t"] Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.607498 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.618247 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.618393 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-catalog-content\") pod \"certified-operators-ssd6t\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.618472 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-utilities\") pod \"certified-operators-ssd6t\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.618497 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mr2h\" (UniqueName: \"kubernetes.io/projected/498090a1-7333-4c7f-b48b-ae3ea0166165-kube-api-access-8mr2h\") pod \"certified-operators-ssd6t\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.618663 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 12:13:47.118652832 +0000 UTC m=+125.025866184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.619188 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-catalog-content\") pod \"certified-operators-ssd6t\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.619555 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-utilities\") pod \"certified-operators-ssd6t\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.627759 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qsb7t"] Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.640099 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.660904 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" 
event={"ID":"059853c5-abf6-459b-9719-cc6688fd6e0a","Type":"ContainerStarted","Data":"38c596597183f827c7e18809812be0d2235c2fdf1e2592eaa2795b2b2776bd86"} Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.660954 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" event={"ID":"059853c5-abf6-459b-9719-cc6688fd6e0a","Type":"ContainerStarted","Data":"03f852a414326a5d59aa932bf8ea7a0a61f85bca3dfdc2d6811e981b4b67744c"} Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.662757 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" event={"ID":"a5dda160-6a44-4f03-b9a4-9baeeae03a54","Type":"ContainerStarted","Data":"188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27"} Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.663259 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.668159 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.676210 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mr2h\" (UniqueName: \"kubernetes.io/projected/498090a1-7333-4c7f-b48b-ae3ea0166165-kube-api-access-8mr2h\") pod \"certified-operators-ssd6t\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.676378 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pss4j" 
event={"ID":"166b9d5c-74b2-4891-bbba-f4c2e81b683d","Type":"ContainerStarted","Data":"eccac2a9cc61db6006faa763aee867ee8b35efa9286375f96bcfdb1270f1617c"} Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.689235 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" event={"ID":"5cfef378-bc85-4056-bccd-fbfec5535325","Type":"ContainerStarted","Data":"8fba5cd90dd135a2ffdfa48f8e4f75ed70b21fe5bc3ec7d614da3e4e5993aff7"} Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.694549 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ztdb4" event={"ID":"a25744d0-f7b9-4349-a07c-9655dd2a84c2","Type":"ContainerStarted","Data":"fb5d80aac790e2168cb21e08c89170aebba37737fcf467087783726296175790"} Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.696380 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qf9rv" event={"ID":"f8ffe635-f56c-4c0b-915a-1c5aab084305","Type":"ContainerStarted","Data":"604e31a2e1db6066c69ef21190503793bfc78aec3b99c31f6d3ab686729e4869"} Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.713015 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.733637 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsjw\" (UniqueName: \"kubernetes.io/projected/df50fe7e-b188-43c2-bfeb-e82b35741ad1-kube-api-access-kgsjw\") pod \"community-operators-qsb7t\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.733704 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-utilities\") pod \"community-operators-qsb7t\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.733725 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-catalog-content\") pod \"community-operators-qsb7t\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.733813 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.735217 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.235204381 +0000 UTC m=+125.142417733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.741864 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" event={"ID":"60f10e1e-0000-46ee-9c26-d79f680a79df","Type":"ContainerStarted","Data":"2ce21af5ef629ad4bd91e1f5b14a71ca8ea5ed473df268a7525c3c4bfdadfa6e"} Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.762445 4834 generic.go:334] "Generic (PLEG): container finished" podID="280cee19-adbf-4307-ac10-337b76f6b6d1" containerID="da409ec7e6526213d87b86559df271455aada2847dec663f781f0a498388c1f6" exitCode=0 Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.762922 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" event={"ID":"280cee19-adbf-4307-ac10-337b76f6b6d1","Type":"ContainerDied","Data":"da409ec7e6526213d87b86559df271455aada2847dec663f781f0a498388c1f6"} Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.774094 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" podStartSLOduration=102.77408091 podStartE2EDuration="1m42.77408091s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 
12:13:46.629969234 +0000 UTC m=+124.537182586" watchObservedRunningTime="2025-11-26 12:13:46.77408091 +0000 UTC m=+124.681294261" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.783162 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.790575 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-tdqww" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.790653 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c5m8m" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.808509 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" podStartSLOduration=102.808499093 podStartE2EDuration="1m42.808499093s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.777498196 +0000 UTC m=+124.684711538" watchObservedRunningTime="2025-11-26 12:13:46.808499093 +0000 UTC m=+124.715712445" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.809519 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jdszs"] Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.810464 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.816443 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.818354 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.834446 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.834748 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.835165 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsjw\" (UniqueName: \"kubernetes.io/projected/df50fe7e-b188-43c2-bfeb-e82b35741ad1-kube-api-access-kgsjw\") pod \"community-operators-qsb7t\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.835288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-utilities\") pod \"community-operators-qsb7t\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.835337 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-catalog-content\") pod \"community-operators-qsb7t\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.836708 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.336694612 +0000 UTC m=+125.243907964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.838966 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-utilities\") pod \"community-operators-qsb7t\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.839561 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-catalog-content\") pod \"community-operators-qsb7t\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.861539 4834 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zbj9r" podStartSLOduration=102.861529232 podStartE2EDuration="1m42.861529232s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.852164013 +0000 UTC m=+124.759377364" watchObservedRunningTime="2025-11-26 12:13:46.861529232 +0000 UTC m=+124.768742584" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.865604 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdszs"] Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.887954 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsjw\" (UniqueName: \"kubernetes.io/projected/df50fe7e-b188-43c2-bfeb-e82b35741ad1-kube-api-access-kgsjw\") pod \"community-operators-qsb7t\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.921622 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.937011 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.937099 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-utilities\") pod \"certified-operators-jdszs\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.937234 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-catalog-content\") pod \"certified-operators-jdszs\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.937262 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc9pd\" (UniqueName: \"kubernetes.io/projected/88a98958-391d-4c4a-9624-b52c51638e12-kube-api-access-gc9pd\") pod \"certified-operators-jdszs\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:46 crc kubenswrapper[4834]: E1126 12:13:46.937602 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.437587925 +0000 UTC m=+125.344801276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.954380 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jztgs" podStartSLOduration=102.954361111 podStartE2EDuration="1m42.954361111s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.886669822 +0000 UTC m=+124.793883173" watchObservedRunningTime="2025-11-26 12:13:46.954361111 +0000 UTC m=+124.861574464" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.956350 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qf9rv" podStartSLOduration=7.956342941 podStartE2EDuration="7.956342941s" podCreationTimestamp="2025-11-26 12:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.953697907 +0000 UTC m=+124.860911259" watchObservedRunningTime="2025-11-26 12:13:46.956342941 +0000 UTC m=+124.863556294" Nov 26 12:13:46 crc kubenswrapper[4834]: I1126 12:13:46.987555 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bkfhv" 
podStartSLOduration=102.987543367 podStartE2EDuration="1m42.987543367s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:46.984546726 +0000 UTC m=+124.891760078" watchObservedRunningTime="2025-11-26 12:13:46.987543367 +0000 UTC m=+124.894756718" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.038473 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.038689 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-catalog-content\") pod \"certified-operators-jdszs\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.038722 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc9pd\" (UniqueName: \"kubernetes.io/projected/88a98958-391d-4c4a-9624-b52c51638e12-kube-api-access-gc9pd\") pod \"certified-operators-jdszs\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.038774 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-utilities\") pod \"certified-operators-jdszs\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:47 crc 
kubenswrapper[4834]: I1126 12:13:47.039424 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-utilities\") pod \"certified-operators-jdszs\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.039490 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.53947714 +0000 UTC m=+125.446690493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.039893 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-catalog-content\") pod \"certified-operators-jdszs\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.047435 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ktqqx"] Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.048269 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.048636 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j2jb2" podStartSLOduration=103.048608538 podStartE2EDuration="1m43.048608538s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:47.036648058 +0000 UTC m=+124.943861410" watchObservedRunningTime="2025-11-26 12:13:47.048608538 +0000 UTC m=+124.955821891" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.089855 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc9pd\" (UniqueName: \"kubernetes.io/projected/88a98958-391d-4c4a-9624-b52c51638e12-kube-api-access-gc9pd\") pod \"certified-operators-jdszs\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.092950 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktqqx"] Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.115990 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5flwm" podStartSLOduration=103.115970566 podStartE2EDuration="1m43.115970566s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:47.084264354 +0000 UTC m=+124.991477706" watchObservedRunningTime="2025-11-26 12:13:47.115970566 +0000 UTC m=+125.023183917" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.143272 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-catalog-content\") pod \"community-operators-ktqqx\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.143370 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.143402 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-utilities\") pod \"community-operators-ktqqx\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.143427 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdflf\" (UniqueName: \"kubernetes.io/projected/e9cd17a8-2071-42d3-a704-744ff5e8b53a-kube-api-access-kdflf\") pod \"community-operators-ktqqx\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.143719 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.643706345 +0000 UTC m=+125.550919697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.164485 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.246200 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.246388 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.746356561 +0000 UTC m=+125.653569914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.246632 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.246890 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-utilities\") pod \"community-operators-ktqqx\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.246940 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdflf\" (UniqueName: \"kubernetes.io/projected/e9cd17a8-2071-42d3-a704-744ff5e8b53a-kube-api-access-kdflf\") pod \"community-operators-ktqqx\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.247571 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-utilities\") pod \"community-operators-ktqqx\" (UID: 
\"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.247671 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.747663474 +0000 UTC m=+125.654876826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.247140 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-catalog-content\") pod \"community-operators-ktqqx\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.252584 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-catalog-content\") pod \"community-operators-ktqqx\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.258234 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8j9fr" podStartSLOduration=103.258214337 podStartE2EDuration="1m43.258214337s" 
podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:47.250749553 +0000 UTC m=+125.157962906" watchObservedRunningTime="2025-11-26 12:13:47.258214337 +0000 UTC m=+125.165427689" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.292520 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.293064 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.305989 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdflf\" (UniqueName: \"kubernetes.io/projected/e9cd17a8-2071-42d3-a704-744ff5e8b53a-kube-api-access-kdflf\") pod \"community-operators-ktqqx\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.313580 4834 patch_prober.go:28] interesting pod/apiserver-76f77b778f-hvfh2 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]log ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]etcd ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/generic-apiserver-start-informers ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/max-in-flight-filter ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 26 12:13:47 crc 
kubenswrapper[4834]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 26 12:13:47 crc kubenswrapper[4834]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/project.openshift.io-projectcache ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/openshift.io-startinformers ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 26 12:13:47 crc kubenswrapper[4834]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 26 12:13:47 crc kubenswrapper[4834]: livez check failed Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.315728 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" podUID="60f10e1e-0000-46ee-9c26-d79f680a79df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.316414 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-tlgd5" podStartSLOduration=103.316394732 podStartE2EDuration="1m43.316394732s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:47.316196347 +0000 UTC m=+125.223409699" watchObservedRunningTime="2025-11-26 12:13:47.316394732 +0000 UTC m=+125.223608084" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.352081 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.352658 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.85264434 +0000 UTC m=+125.759857692 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.385618 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.456961 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.457010 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" podStartSLOduration=103.456992139 podStartE2EDuration="1m43.456992139s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:47.42706868 +0000 UTC m=+125.334282032" watchObservedRunningTime="2025-11-26 12:13:47.456992139 +0000 UTC m=+125.364205490" Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.457296 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:47.957285494 +0000 UTC m=+125.864498845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.464452 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ssd6t"] Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.558690 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.559249 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.059218734 +0000 UTC m=+125.966432086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.574008 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:47 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:47 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:47 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.574044 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.608427 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qsb7t"] Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.624069 4834 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.660914 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.661656 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.161624207 +0000 UTC m=+126.068837559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.695714 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdszs"] Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.763340 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.764431 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.264407194 +0000 UTC m=+126.171620546 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.798867 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pss4j" event={"ID":"166b9d5c-74b2-4891-bbba-f4c2e81b683d","Type":"ContainerStarted","Data":"6c71c7a25aa0a784891aab7ce54059233504b7969b07381ffb66a857cc5a12c8"} Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.798923 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pss4j" event={"ID":"166b9d5c-74b2-4891-bbba-f4c2e81b683d","Type":"ContainerStarted","Data":"db8963535424610d535d240879cf6e6df011bb9594082208cafd6e383cfee4cd"} Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.807552 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsb7t" event={"ID":"df50fe7e-b188-43c2-bfeb-e82b35741ad1","Type":"ContainerStarted","Data":"fe1b6a8f11e7ee2374140471a2e03a8a08c682f18298415d6ce56637e45724d2"} Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.809068 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdszs" event={"ID":"88a98958-391d-4c4a-9624-b52c51638e12","Type":"ContainerStarted","Data":"0380f92dee94bb46c86902979539c434f09c6bb23bfe213036c95e202d89ddb1"} Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.811789 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssd6t" 
event={"ID":"498090a1-7333-4c7f-b48b-ae3ea0166165","Type":"ContainerStarted","Data":"e11e08d133a8439dc89c67f61e0c1e8c56658f03842abd08f799390e8252ad27"} Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.829442 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-625th" Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.865478 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.867968 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.367952495 +0000 UTC m=+126.275165847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.874141 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktqqx"] Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.966643 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.967056 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.467020413 +0000 UTC m=+126.374233764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:47 crc kubenswrapper[4834]: I1126 12:13:47.968115 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:47 crc kubenswrapper[4834]: E1126 12:13:47.969708 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.469695103 +0000 UTC m=+126.376908455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.069475 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:48 crc kubenswrapper[4834]: E1126 12:13:48.069662 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.56962915 +0000 UTC m=+126.476842503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.069811 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:48 crc kubenswrapper[4834]: E1126 12:13:48.070167 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.570159704 +0000 UTC m=+126.477373057 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.170567 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:48 crc kubenswrapper[4834]: E1126 12:13:48.170986 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.67096967 +0000 UTC m=+126.578183021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.197909 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.199850 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.202812 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.203124 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.212041 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.275454 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:48 crc kubenswrapper[4834]: E1126 12:13:48.275796 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.775780944 +0000 UTC m=+126.682994296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.276109 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.276141 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.279440 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.376568 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/280cee19-adbf-4307-ac10-337b76f6b6d1-config-volume\") pod \"280cee19-adbf-4307-ac10-337b76f6b6d1\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.376632 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vcj6\" (UniqueName: \"kubernetes.io/projected/280cee19-adbf-4307-ac10-337b76f6b6d1-kube-api-access-4vcj6\") pod \"280cee19-adbf-4307-ac10-337b76f6b6d1\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.376776 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.376799 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/280cee19-adbf-4307-ac10-337b76f6b6d1-secret-volume\") pod \"280cee19-adbf-4307-ac10-337b76f6b6d1\" (UID: \"280cee19-adbf-4307-ac10-337b76f6b6d1\") " Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.377027 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:48 crc 
kubenswrapper[4834]: I1126 12:13:48.377165 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.377248 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.377900 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/280cee19-adbf-4307-ac10-337b76f6b6d1-config-volume" (OuterVolumeSpecName: "config-volume") pod "280cee19-adbf-4307-ac10-337b76f6b6d1" (UID: "280cee19-adbf-4307-ac10-337b76f6b6d1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:13:48 crc kubenswrapper[4834]: E1126 12:13:48.378392 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 12:13:48.87836755 +0000 UTC m=+126.785580902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.384697 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280cee19-adbf-4307-ac10-337b76f6b6d1-kube-api-access-4vcj6" (OuterVolumeSpecName: "kube-api-access-4vcj6") pod "280cee19-adbf-4307-ac10-337b76f6b6d1" (UID: "280cee19-adbf-4307-ac10-337b76f6b6d1"). InnerVolumeSpecName "kube-api-access-4vcj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.384765 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280cee19-adbf-4307-ac10-337b76f6b6d1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "280cee19-adbf-4307-ac10-337b76f6b6d1" (UID: "280cee19-adbf-4307-ac10-337b76f6b6d1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.394936 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.478146 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.478218 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/280cee19-adbf-4307-ac10-337b76f6b6d1-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.478243 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/280cee19-adbf-4307-ac10-337b76f6b6d1-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.478254 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vcj6\" (UniqueName: \"kubernetes.io/projected/280cee19-adbf-4307-ac10-337b76f6b6d1-kube-api-access-4vcj6\") on node \"crc\" DevicePath \"\"" Nov 26 12:13:48 crc kubenswrapper[4834]: E1126 12:13:48.478534 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 12:13:48.978517196 +0000 UTC m=+126.885730548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9cvdp" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.536004 4834 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-26T12:13:47.624087553Z","Handler":null,"Name":""} Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.543642 4834 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.543681 4834 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.554301 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.562203 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:48 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:48 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:48 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.562268 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.579582 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.584293 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wr48n"] Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.584472 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 26 12:13:48 crc kubenswrapper[4834]: E1126 12:13:48.584525 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280cee19-adbf-4307-ac10-337b76f6b6d1" containerName="collect-profiles" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.584538 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="280cee19-adbf-4307-ac10-337b76f6b6d1" containerName="collect-profiles" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.584644 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="280cee19-adbf-4307-ac10-337b76f6b6d1" containerName="collect-profiles" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.585331 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.588089 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.597759 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr48n"] Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.682109 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.682268 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-utilities\") pod \"redhat-marketplace-wr48n\" (UID: 
\"14803438-3861-4da4-a23f-798966cbb5e4\") " pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.682299 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-catalog-content\") pod \"redhat-marketplace-wr48n\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.682359 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p5mb\" (UniqueName: \"kubernetes.io/projected/14803438-3861-4da4-a23f-798966cbb5e4-kube-api-access-9p5mb\") pod \"redhat-marketplace-wr48n\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.686581 4834 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.686622 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.718694 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9cvdp\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") " pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.752300 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.761794 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.783603 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-utilities\") pod \"redhat-marketplace-wr48n\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.783646 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-catalog-content\") pod \"redhat-marketplace-wr48n\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.783662 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p5mb\" (UniqueName: \"kubernetes.io/projected/14803438-3861-4da4-a23f-798966cbb5e4-kube-api-access-9p5mb\") pod \"redhat-marketplace-wr48n\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.784253 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-utilities\") pod \"redhat-marketplace-wr48n\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.784492 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-catalog-content\") pod \"redhat-marketplace-wr48n\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " 
pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.800623 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p5mb\" (UniqueName: \"kubernetes.io/projected/14803438-3861-4da4-a23f-798966cbb5e4-kube-api-access-9p5mb\") pod \"redhat-marketplace-wr48n\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.806724 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 12:13:48 crc kubenswrapper[4834]: W1126 12:13:48.809826 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5834d4a5_27c9_46d1_90b9_bfe5fbbed654.slice/crio-44cdaab1fb73d6a4813d9a73b8ecf18a6396434fc0937e1f0a38e5ce91c41f3c WatchSource:0}: Error finding container 44cdaab1fb73d6a4813d9a73b8ecf18a6396434fc0937e1f0a38e5ce91c41f3c: Status 404 returned error can't find the container with id 44cdaab1fb73d6a4813d9a73b8ecf18a6396434fc0937e1f0a38e5ce91c41f3c Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.816131 4834 generic.go:334] "Generic (PLEG): container finished" podID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerID="f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65" exitCode=0 Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.816332 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssd6t" event={"ID":"498090a1-7333-4c7f-b48b-ae3ea0166165","Type":"ContainerDied","Data":"f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65"} Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.817684 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" 
event={"ID":"280cee19-adbf-4307-ac10-337b76f6b6d1","Type":"ContainerDied","Data":"e39618af9cba4923b5a3b7225a5748459a1f2c7f3052f9ea91cd4f9b80216a12"} Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.817714 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e39618af9cba4923b5a3b7225a5748459a1f2c7f3052f9ea91cd4f9b80216a12" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.817779 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.823080 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5834d4a5-27c9-46d1-90b9-bfe5fbbed654","Type":"ContainerStarted","Data":"44cdaab1fb73d6a4813d9a73b8ecf18a6396434fc0937e1f0a38e5ce91c41f3c"} Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.827075 4834 generic.go:334] "Generic (PLEG): container finished" podID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerID="5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068" exitCode=0 Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.827413 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktqqx" event={"ID":"e9cd17a8-2071-42d3-a704-744ff5e8b53a","Type":"ContainerDied","Data":"5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068"} Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.827467 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktqqx" event={"ID":"e9cd17a8-2071-42d3-a704-744ff5e8b53a","Type":"ContainerStarted","Data":"25f4ff4b4d5aab26347ee766e65b0717c16cf2b4bb68f5d4f3dc8fe370d8c325"} Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.830344 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 12:13:48 
crc kubenswrapper[4834]: I1126 12:13:48.832415 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pss4j" event={"ID":"166b9d5c-74b2-4891-bbba-f4c2e81b683d","Type":"ContainerStarted","Data":"0d560d9b1d4f50910619fbdc45da91d394df6749968d075f29027a26c0859d78"} Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.834690 4834 generic.go:334] "Generic (PLEG): container finished" podID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerID="32ed09df8c774278ee303e749d111b7f344dc2a20d27555ca24c8f43947d88c3" exitCode=0 Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.834758 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsb7t" event={"ID":"df50fe7e-b188-43c2-bfeb-e82b35741ad1","Type":"ContainerDied","Data":"32ed09df8c774278ee303e749d111b7f344dc2a20d27555ca24c8f43947d88c3"} Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.845385 4834 generic.go:334] "Generic (PLEG): container finished" podID="88a98958-391d-4c4a-9624-b52c51638e12" containerID="9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096" exitCode=0 Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.846172 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdszs" event={"ID":"88a98958-391d-4c4a-9624-b52c51638e12","Type":"ContainerDied","Data":"9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096"} Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.857482 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pss4j" podStartSLOduration=9.857469778 podStartE2EDuration="9.857469778s" podCreationTimestamp="2025-11-26 12:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:48.855694038 +0000 UTC m=+126.762907390" watchObservedRunningTime="2025-11-26 
12:13:48.857469778 +0000 UTC m=+126.764683130" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.898545 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.989450 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2n7"] Nov 26 12:13:48 crc kubenswrapper[4834]: I1126 12:13:48.990249 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.003071 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2n7"] Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.038500 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9cvdp"] Nov 26 12:13:49 crc kubenswrapper[4834]: W1126 12:13:49.053432 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5f5d03b_e71e_44d5_96c2_2dd188cd3712.slice/crio-cea4aff14949b69f6f79d71764e303255564f2c5511256ee62943b5bfceb4641 WatchSource:0}: Error finding container cea4aff14949b69f6f79d71764e303255564f2c5511256ee62943b5bfceb4641: Status 404 returned error can't find the container with id cea4aff14949b69f6f79d71764e303255564f2c5511256ee62943b5bfceb4641 Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.089945 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-utilities\") pod \"redhat-marketplace-wt2n7\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.090298 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-catalog-content\") pod \"redhat-marketplace-wt2n7\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.090608 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwqr5\" (UniqueName: \"kubernetes.io/projected/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-kube-api-access-pwqr5\") pod \"redhat-marketplace-wt2n7\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.138509 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr48n"] Nov 26 12:13:49 crc kubenswrapper[4834]: W1126 12:13:49.143604 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14803438_3861_4da4_a23f_798966cbb5e4.slice/crio-2afd59cf4240359c12ed385e762cfb55d3041b7353ff076c497181216639c6e9 WatchSource:0}: Error finding container 2afd59cf4240359c12ed385e762cfb55d3041b7353ff076c497181216639c6e9: Status 404 returned error can't find the container with id 2afd59cf4240359c12ed385e762cfb55d3041b7353ff076c497181216639c6e9 Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.192126 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-catalog-content\") pod \"redhat-marketplace-wt2n7\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.192229 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pwqr5\" (UniqueName: \"kubernetes.io/projected/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-kube-api-access-pwqr5\") pod \"redhat-marketplace-wt2n7\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.192360 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-utilities\") pod \"redhat-marketplace-wt2n7\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.192822 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-utilities\") pod \"redhat-marketplace-wt2n7\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.193291 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-catalog-content\") pod \"redhat-marketplace-wt2n7\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.212436 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwqr5\" (UniqueName: \"kubernetes.io/projected/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-kube-api-access-pwqr5\") pod \"redhat-marketplace-wt2n7\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.314906 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.455475 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2n7"] Nov 26 12:13:49 crc kubenswrapper[4834]: W1126 12:13:49.462150 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49d03e80_ee1d_4afa_b773_cb3eb54ff2bb.slice/crio-b3fcb84f321f33e85c24e129f4f09e42b99963f199d82e860f8d90b650e96bf4 WatchSource:0}: Error finding container b3fcb84f321f33e85c24e129f4f09e42b99963f199d82e860f8d90b650e96bf4: Status 404 returned error can't find the container with id b3fcb84f321f33e85c24e129f4f09e42b99963f199d82e860f8d90b650e96bf4 Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.561799 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:49 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:49 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:49 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.561865 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.581526 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mqj69"] Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.582508 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.584505 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.594768 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqj69"] Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.700730 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmjfl\" (UniqueName: \"kubernetes.io/projected/43791ad6-52e3-4dff-8b3b-3098c3011e85-kube-api-access-vmjfl\") pod \"redhat-operators-mqj69\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.700771 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-catalog-content\") pod \"redhat-operators-mqj69\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.700815 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-utilities\") pod \"redhat-operators-mqj69\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.801667 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-catalog-content\") pod \"redhat-operators-mqj69\" (UID: 
\"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.801743 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-utilities\") pod \"redhat-operators-mqj69\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.801833 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmjfl\" (UniqueName: \"kubernetes.io/projected/43791ad6-52e3-4dff-8b3b-3098c3011e85-kube-api-access-vmjfl\") pod \"redhat-operators-mqj69\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.802560 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-catalog-content\") pod \"redhat-operators-mqj69\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.802776 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-utilities\") pod \"redhat-operators-mqj69\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.832303 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmjfl\" (UniqueName: \"kubernetes.io/projected/43791ad6-52e3-4dff-8b3b-3098c3011e85-kube-api-access-vmjfl\") pod \"redhat-operators-mqj69\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " 
pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.854448 4834 generic.go:334] "Generic (PLEG): container finished" podID="5834d4a5-27c9-46d1-90b9-bfe5fbbed654" containerID="2535900eed04ebbdad769b20c98b629ed935aa1ef12075feb47d0203479298fb" exitCode=0 Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.854507 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5834d4a5-27c9-46d1-90b9-bfe5fbbed654","Type":"ContainerDied","Data":"2535900eed04ebbdad769b20c98b629ed935aa1ef12075feb47d0203479298fb"} Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.857831 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" event={"ID":"c5f5d03b-e71e-44d5-96c2-2dd188cd3712","Type":"ContainerStarted","Data":"f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117"} Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.857952 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" event={"ID":"c5f5d03b-e71e-44d5-96c2-2dd188cd3712","Type":"ContainerStarted","Data":"cea4aff14949b69f6f79d71764e303255564f2c5511256ee62943b5bfceb4641"} Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.857995 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.860610 4834 generic.go:334] "Generic (PLEG): container finished" podID="14803438-3861-4da4-a23f-798966cbb5e4" containerID="7d0ce19d3bc307943fb0dcff53da8d566d320ca5886cd59e20df9427ff82197c" exitCode=0 Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.860795 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr48n" 
event={"ID":"14803438-3861-4da4-a23f-798966cbb5e4","Type":"ContainerDied","Data":"7d0ce19d3bc307943fb0dcff53da8d566d320ca5886cd59e20df9427ff82197c"} Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.860852 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr48n" event={"ID":"14803438-3861-4da4-a23f-798966cbb5e4","Type":"ContainerStarted","Data":"2afd59cf4240359c12ed385e762cfb55d3041b7353ff076c497181216639c6e9"} Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.864790 4834 generic.go:334] "Generic (PLEG): container finished" podID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerID="1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2" exitCode=0 Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.864832 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2n7" event={"ID":"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb","Type":"ContainerDied","Data":"1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2"} Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.864860 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2n7" event={"ID":"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb","Type":"ContainerStarted","Data":"b3fcb84f321f33e85c24e129f4f09e42b99963f199d82e860f8d90b650e96bf4"} Nov 26 12:13:49 crc kubenswrapper[4834]: I1126 12:13:49.908235 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" podStartSLOduration=105.908211726 podStartE2EDuration="1m45.908211726s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:13:49.901660951 +0000 UTC m=+127.808874304" watchObservedRunningTime="2025-11-26 12:13:49.908211726 +0000 UTC m=+127.815425078" Nov 26 12:13:49 crc 
kubenswrapper[4834]: I1126 12:13:49.967515 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:49.996740 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mmkwr"] Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:49.997751 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.019847 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mmkwr"] Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.124678 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8ww\" (UniqueName: \"kubernetes.io/projected/dd50fc84-b004-431d-a78f-2fdf871c3c02-kube-api-access-px8ww\") pod \"redhat-operators-mmkwr\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.124924 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-utilities\") pod \"redhat-operators-mmkwr\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.124966 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-catalog-content\") pod \"redhat-operators-mmkwr\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.226298 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8ww\" (UniqueName: \"kubernetes.io/projected/dd50fc84-b004-431d-a78f-2fdf871c3c02-kube-api-access-px8ww\") pod \"redhat-operators-mmkwr\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.226753 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-utilities\") pod \"redhat-operators-mmkwr\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.226781 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-catalog-content\") pod \"redhat-operators-mmkwr\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.227438 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-catalog-content\") pod \"redhat-operators-mmkwr\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.227520 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-utilities\") pod \"redhat-operators-mmkwr\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.241699 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-px8ww\" (UniqueName: \"kubernetes.io/projected/dd50fc84-b004-431d-a78f-2fdf871c3c02-kube-api-access-px8ww\") pod \"redhat-operators-mmkwr\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.269172 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqj69"] Nov 26 12:13:50 crc kubenswrapper[4834]: W1126 12:13:50.282615 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43791ad6_52e3_4dff_8b3b_3098c3011e85.slice/crio-c9cd2f5c9f5b4ec320c7ffc83f68eccf07c07c66c759b82ca51226f0e587d1c6 WatchSource:0}: Error finding container c9cd2f5c9f5b4ec320c7ffc83f68eccf07c07c66c759b82ca51226f0e587d1c6: Status 404 returned error can't find the container with id c9cd2f5c9f5b4ec320c7ffc83f68eccf07c07c66c759b82ca51226f0e587d1c6 Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.379210 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.425186 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.561812 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:50 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:50 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:50 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.561878 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.799350 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mmkwr"] Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.877030 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmkwr" event={"ID":"dd50fc84-b004-431d-a78f-2fdf871c3c02","Type":"ContainerStarted","Data":"525bcbf5005f384a4e28c8962f1f81345e41400d011c99adfa75ab22b00618a3"} Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.882611 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqj69" event={"ID":"43791ad6-52e3-4dff-8b3b-3098c3011e85","Type":"ContainerDied","Data":"c7252708fed4ed3bb8dd6be93e30dc70fa08953dc47f7b41f9e430089337b23b"} Nov 26 12:13:50 crc 
kubenswrapper[4834]: I1126 12:13:50.882506 4834 generic.go:334] "Generic (PLEG): container finished" podID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerID="c7252708fed4ed3bb8dd6be93e30dc70fa08953dc47f7b41f9e430089337b23b" exitCode=0 Nov 26 12:13:50 crc kubenswrapper[4834]: I1126 12:13:50.883057 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqj69" event={"ID":"43791ad6-52e3-4dff-8b3b-3098c3011e85","Type":"ContainerStarted","Data":"c9cd2f5c9f5b4ec320c7ffc83f68eccf07c07c66c759b82ca51226f0e587d1c6"} Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.081781 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.140270 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kubelet-dir\") pod \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\" (UID: \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\") " Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.140333 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kube-api-access\") pod \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\" (UID: \"5834d4a5-27c9-46d1-90b9-bfe5fbbed654\") " Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.140377 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5834d4a5-27c9-46d1-90b9-bfe5fbbed654" (UID: "5834d4a5-27c9-46d1-90b9-bfe5fbbed654"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.140707 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.148688 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5834d4a5-27c9-46d1-90b9-bfe5fbbed654" (UID: "5834d4a5-27c9-46d1-90b9-bfe5fbbed654"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.242109 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5834d4a5-27c9-46d1-90b9-bfe5fbbed654-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.561839 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:51 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:51 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:51 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.561903 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.890685 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5834d4a5-27c9-46d1-90b9-bfe5fbbed654","Type":"ContainerDied","Data":"44cdaab1fb73d6a4813d9a73b8ecf18a6396434fc0937e1f0a38e5ce91c41f3c"} Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.890722 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44cdaab1fb73d6a4813d9a73b8ecf18a6396434fc0937e1f0a38e5ce91c41f3c" Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.890768 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.895913 4834 generic.go:334] "Generic (PLEG): container finished" podID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerID="c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7" exitCode=0 Nov 26 12:13:51 crc kubenswrapper[4834]: I1126 12:13:51.895948 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmkwr" event={"ID":"dd50fc84-b004-431d-a78f-2fdf871c3c02","Type":"ContainerDied","Data":"c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7"} Nov 26 12:13:52 crc kubenswrapper[4834]: I1126 12:13:52.137286 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9rfmg" Nov 26 12:13:52 crc kubenswrapper[4834]: I1126 12:13:52.185321 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:52 crc kubenswrapper[4834]: I1126 12:13:52.185361 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:13:52 crc kubenswrapper[4834]: I1126 12:13:52.188063 4834 patch_prober.go:28] interesting pod/console-f9d7485db-z2dpf container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Nov 26 12:13:52 crc kubenswrapper[4834]: I1126 12:13:52.188127 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-z2dpf" podUID="17225c8b-a6dd-4958-b1be-58b3cc4ad317" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Nov 26 12:13:52 crc kubenswrapper[4834]: I1126 12:13:52.280937 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:52 crc kubenswrapper[4834]: I1126 12:13:52.285190 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hvfh2" Nov 26 12:13:52 crc kubenswrapper[4834]: I1126 12:13:52.560669 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:52 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:52 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:52 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:52 crc kubenswrapper[4834]: I1126 12:13:52.560725 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.559470 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.562324 4834 patch_prober.go:28] interesting 
pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:53 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:53 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:53 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.562627 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.710235 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 12:13:53 crc kubenswrapper[4834]: E1126 12:13:53.710804 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5834d4a5-27c9-46d1-90b9-bfe5fbbed654" containerName="pruner" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.710875 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5834d4a5-27c9-46d1-90b9-bfe5fbbed654" containerName="pruner" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.711043 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5834d4a5-27c9-46d1-90b9-bfe5fbbed654" containerName="pruner" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.711615 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.713577 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.713867 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.723107 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.776387 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.776438 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.877975 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.878032 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.878083 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:13:53 crc kubenswrapper[4834]: I1126 12:13:53.896449 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:13:54 crc kubenswrapper[4834]: I1126 12:13:54.037601 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:13:54 crc kubenswrapper[4834]: I1126 12:13:54.561618 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:54 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:54 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:54 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:54 crc kubenswrapper[4834]: I1126 12:13:54.561708 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:55 crc kubenswrapper[4834]: I1126 12:13:55.314322 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ztdb4" Nov 26 12:13:55 crc kubenswrapper[4834]: I1126 12:13:55.568209 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:55 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:55 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:55 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:55 crc kubenswrapper[4834]: I1126 12:13:55.568261 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:56 crc kubenswrapper[4834]: I1126 12:13:56.561527 4834 
patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:56 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:56 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:56 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:56 crc kubenswrapper[4834]: I1126 12:13:56.561606 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:57 crc kubenswrapper[4834]: I1126 12:13:57.561149 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:57 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:57 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:57 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:57 crc kubenswrapper[4834]: I1126 12:13:57.561232 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:57 crc kubenswrapper[4834]: I1126 12:13:57.879678 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 12:13:57 crc kubenswrapper[4834]: W1126 12:13:57.894077 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod701ce91a_c4be_40cb_b6f5_80974c8a43a5.slice/crio-7bd9aa2c8647f3569c16a66dd2b14f8eedb94c2bd24da3e4508dd2683174c9b1 WatchSource:0}: Error finding container 7bd9aa2c8647f3569c16a66dd2b14f8eedb94c2bd24da3e4508dd2683174c9b1: Status 404 returned error can't find the container with id 7bd9aa2c8647f3569c16a66dd2b14f8eedb94c2bd24da3e4508dd2683174c9b1 Nov 26 12:13:57 crc kubenswrapper[4834]: I1126 12:13:57.959655 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"701ce91a-c4be-40cb-b6f5-80974c8a43a5","Type":"ContainerStarted","Data":"7bd9aa2c8647f3569c16a66dd2b14f8eedb94c2bd24da3e4508dd2683174c9b1"} Nov 26 12:13:58 crc kubenswrapper[4834]: I1126 12:13:58.561803 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:58 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:58 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:58 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:58 crc kubenswrapper[4834]: I1126 12:13:58.561874 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:13:58 crc kubenswrapper[4834]: I1126 12:13:58.971552 4834 generic.go:334] "Generic (PLEG): container finished" podID="701ce91a-c4be-40cb-b6f5-80974c8a43a5" containerID="c86f5d10d08efe467b4223421d112a9741ab7a4b0233dae9f1d7289fb52cc1fe" exitCode=0 Nov 26 12:13:58 crc kubenswrapper[4834]: I1126 12:13:58.971615 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"701ce91a-c4be-40cb-b6f5-80974c8a43a5","Type":"ContainerDied","Data":"c86f5d10d08efe467b4223421d112a9741ab7a4b0233dae9f1d7289fb52cc1fe"} Nov 26 12:13:59 crc kubenswrapper[4834]: I1126 12:13:59.128428 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:13:59 crc kubenswrapper[4834]: I1126 12:13:59.561999 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:13:59 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:13:59 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:13:59 crc kubenswrapper[4834]: healthz check failed Nov 26 12:13:59 crc kubenswrapper[4834]: I1126 12:13:59.562088 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 12:14:00 crc kubenswrapper[4834]: I1126 12:14:00.560998 4834 patch_prober.go:28] interesting pod/router-default-5444994796-r462f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 12:14:00 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Nov 26 12:14:00 crc kubenswrapper[4834]: [+]process-running ok Nov 26 12:14:00 crc kubenswrapper[4834]: healthz check failed Nov 26 12:14:00 crc kubenswrapper[4834]: I1126 12:14:00.561600 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r462f" podUID="4c7ceb0a-11e6-48ba-a34b-50cf6a7aa6f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 
26 12:14:01 crc kubenswrapper[4834]: I1126 12:14:01.561576 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:14:01 crc kubenswrapper[4834]: I1126 12:14:01.564287 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r462f" Nov 26 12:14:02 crc kubenswrapper[4834]: I1126 12:14:02.185619 4834 patch_prober.go:28] interesting pod/console-f9d7485db-z2dpf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Nov 26 12:14:02 crc kubenswrapper[4834]: I1126 12:14:02.185963 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-z2dpf" podUID="17225c8b-a6dd-4958-b1be-58b3cc4ad317" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Nov 26 12:14:03 crc kubenswrapper[4834]: I1126 12:14:03.271894 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:14:03 crc kubenswrapper[4834]: I1126 12:14:03.341411 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kube-api-access\") pod \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\" (UID: \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\") " Nov 26 12:14:03 crc kubenswrapper[4834]: I1126 12:14:03.341451 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kubelet-dir\") pod \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\" (UID: \"701ce91a-c4be-40cb-b6f5-80974c8a43a5\") " Nov 26 12:14:03 crc kubenswrapper[4834]: I1126 12:14:03.341576 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "701ce91a-c4be-40cb-b6f5-80974c8a43a5" (UID: "701ce91a-c4be-40cb-b6f5-80974c8a43a5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:14:03 crc kubenswrapper[4834]: I1126 12:14:03.341703 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:03 crc kubenswrapper[4834]: I1126 12:14:03.348919 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "701ce91a-c4be-40cb-b6f5-80974c8a43a5" (UID: "701ce91a-c4be-40cb-b6f5-80974c8a43a5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:03 crc kubenswrapper[4834]: I1126 12:14:03.442464 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/701ce91a-c4be-40cb-b6f5-80974c8a43a5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:04 crc kubenswrapper[4834]: I1126 12:14:04.012298 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"701ce91a-c4be-40cb-b6f5-80974c8a43a5","Type":"ContainerDied","Data":"7bd9aa2c8647f3569c16a66dd2b14f8eedb94c2bd24da3e4508dd2683174c9b1"} Nov 26 12:14:04 crc kubenswrapper[4834]: I1126 12:14:04.012392 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bd9aa2c8647f3569c16a66dd2b14f8eedb94c2bd24da3e4508dd2683174c9b1" Nov 26 12:14:04 crc kubenswrapper[4834]: I1126 12:14:04.012353 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.523551 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.524016 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 
12:14:08.524094 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.524169 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.525711 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.525724 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.525776 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.535714 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.535888 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 
12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.539102 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.547868 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.548487 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.629774 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.636266 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.728593 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:14:08 crc kubenswrapper[4834]: I1126 12:14:08.768170 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" Nov 26 12:14:10 crc kubenswrapper[4834]: I1126 12:14:10.048410 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr48n" event={"ID":"14803438-3861-4da4-a23f-798966cbb5e4","Type":"ContainerStarted","Data":"7bd8dea99bc330c365389d4a84c05fea1e2a5be3d32a244cd47a935f6ae853d0"} Nov 26 12:14:10 crc kubenswrapper[4834]: I1126 12:14:10.058529 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsb7t" event={"ID":"df50fe7e-b188-43c2-bfeb-e82b35741ad1","Type":"ContainerStarted","Data":"40d89df39d5041b71cd47bb6855a15561cf7c89392d0591f79f4b554d6b8954a"} Nov 26 12:14:10 crc kubenswrapper[4834]: W1126 12:14:10.341527 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-b7f0e722a8868b7d5970e262b13fbfaaa7537c1ab28c0753b87776283d271c24 WatchSource:0}: Error finding container b7f0e722a8868b7d5970e262b13fbfaaa7537c1ab28c0753b87776283d271c24: Status 404 returned error can't find the container with id b7f0e722a8868b7d5970e262b13fbfaaa7537c1ab28c0753b87776283d271c24 Nov 26 12:14:10 crc kubenswrapper[4834]: W1126 12:14:10.370641 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c843d77a46dfcd3481edc1038178a19af8016d93c41c633e3d1965d99e3bf73f WatchSource:0}: Error finding container c843d77a46dfcd3481edc1038178a19af8016d93c41c633e3d1965d99e3bf73f: Status 404 returned error can't find the container with id 
c843d77a46dfcd3481edc1038178a19af8016d93c41c633e3d1965d99e3bf73f Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.068116 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b72d07535b18beab1c7af283307835f9f2fb7fbb55fa6fd2c88adfb8283ca7ff"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.068508 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"65862d418509b37ec4b96c82c988e94eddd27a9f86643261bba3ac35649cd40c"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.069274 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dd2a8d6db7fe769d4f8ca2834e2aaf62e52c86a15475aaf245e9eda899576a94"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.069297 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c843d77a46dfcd3481edc1038178a19af8016d93c41c633e3d1965d99e3bf73f"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.069472 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.072177 4834 generic.go:334] "Generic (PLEG): container finished" podID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerID="9122400632aa0ac0c6650a1a84fc2f34a2b8827a0bee9d8f95be249fbd3385f2" exitCode=0 Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.072276 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-mqj69" event={"ID":"43791ad6-52e3-4dff-8b3b-3098c3011e85","Type":"ContainerDied","Data":"9122400632aa0ac0c6650a1a84fc2f34a2b8827a0bee9d8f95be249fbd3385f2"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.074161 4834 generic.go:334] "Generic (PLEG): container finished" podID="14803438-3861-4da4-a23f-798966cbb5e4" containerID="7bd8dea99bc330c365389d4a84c05fea1e2a5be3d32a244cd47a935f6ae853d0" exitCode=0 Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.074282 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr48n" event={"ID":"14803438-3861-4da4-a23f-798966cbb5e4","Type":"ContainerDied","Data":"7bd8dea99bc330c365389d4a84c05fea1e2a5be3d32a244cd47a935f6ae853d0"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.076665 4834 generic.go:334] "Generic (PLEG): container finished" podID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerID="e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601" exitCode=0 Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.076714 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2n7" event={"ID":"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb","Type":"ContainerDied","Data":"e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.077968 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a43ca361bd56bcbab74c0b36dd6633d044c584d7c3a640d2ccd0d0fee21e5192"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.078172 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b7f0e722a8868b7d5970e262b13fbfaaa7537c1ab28c0753b87776283d271c24"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.080618 4834 generic.go:334] "Generic (PLEG): container finished" podID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerID="a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4" exitCode=0 Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.080664 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktqqx" event={"ID":"e9cd17a8-2071-42d3-a704-744ff5e8b53a","Type":"ContainerDied","Data":"a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.087386 4834 generic.go:334] "Generic (PLEG): container finished" podID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerID="40d89df39d5041b71cd47bb6855a15561cf7c89392d0591f79f4b554d6b8954a" exitCode=0 Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.087442 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsb7t" event={"ID":"df50fe7e-b188-43c2-bfeb-e82b35741ad1","Type":"ContainerDied","Data":"40d89df39d5041b71cd47bb6855a15561cf7c89392d0591f79f4b554d6b8954a"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.092616 4834 generic.go:334] "Generic (PLEG): container finished" podID="88a98958-391d-4c4a-9624-b52c51638e12" containerID="0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c" exitCode=0 Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.092680 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdszs" event={"ID":"88a98958-391d-4c4a-9624-b52c51638e12","Type":"ContainerDied","Data":"0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.098632 4834 generic.go:334] "Generic (PLEG): container 
finished" podID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerID="578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f" exitCode=0 Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.098688 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssd6t" event={"ID":"498090a1-7333-4c7f-b48b-ae3ea0166165","Type":"ContainerDied","Data":"578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f"} Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.102676 4834 generic.go:334] "Generic (PLEG): container finished" podID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerID="b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2" exitCode=0 Nov 26 12:14:11 crc kubenswrapper[4834]: I1126 12:14:11.102703 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmkwr" event={"ID":"dd50fc84-b004-431d-a78f-2fdf871c3c02","Type":"ContainerDied","Data":"b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2"} Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.129489 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmkwr" event={"ID":"dd50fc84-b004-431d-a78f-2fdf871c3c02","Type":"ContainerStarted","Data":"d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351"} Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.132385 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2n7" event={"ID":"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb","Type":"ContainerStarted","Data":"ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208"} Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.137407 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr48n" 
event={"ID":"14803438-3861-4da4-a23f-798966cbb5e4","Type":"ContainerStarted","Data":"9705e29f4f5f142b8e1303940d6a1cbfa85fadf6a76d6af4225c1e99aa1f0271"} Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.142063 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsb7t" event={"ID":"df50fe7e-b188-43c2-bfeb-e82b35741ad1","Type":"ContainerStarted","Data":"e7b02f27acee5c74e7a2daa61db423e19e2d57e0938e16a1562cf2e399a99367"} Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.144113 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdszs" event={"ID":"88a98958-391d-4c4a-9624-b52c51638e12","Type":"ContainerStarted","Data":"f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f"} Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.146968 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mmkwr" podStartSLOduration=3.320389225 podStartE2EDuration="23.14695874s" podCreationTimestamp="2025-11-26 12:13:49 +0000 UTC" firstStartedPulling="2025-11-26 12:13:51.897975887 +0000 UTC m=+129.805189238" lastFinishedPulling="2025-11-26 12:14:11.724545401 +0000 UTC m=+149.631758753" observedRunningTime="2025-11-26 12:14:12.144462059 +0000 UTC m=+150.051675411" watchObservedRunningTime="2025-11-26 12:14:12.14695874 +0000 UTC m=+150.054172093" Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.148480 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqj69" event={"ID":"43791ad6-52e3-4dff-8b3b-3098c3011e85","Type":"ContainerStarted","Data":"84d9f5f513e8047d2abde8e0876bc7fb60bf707beb670c263aa4bd90535426c9"} Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.183672 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wr48n" podStartSLOduration=2.151962306 
podStartE2EDuration="24.183665389s" podCreationTimestamp="2025-11-26 12:13:48 +0000 UTC" firstStartedPulling="2025-11-26 12:13:49.862709581 +0000 UTC m=+127.769922933" lastFinishedPulling="2025-11-26 12:14:11.894412663 +0000 UTC m=+149.801626016" observedRunningTime="2025-11-26 12:14:12.182952434 +0000 UTC m=+150.090165787" watchObservedRunningTime="2025-11-26 12:14:12.183665389 +0000 UTC m=+150.090878741" Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.183801 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qsb7t" podStartSLOduration=3.272220689 podStartE2EDuration="26.1837984s" podCreationTimestamp="2025-11-26 12:13:46 +0000 UTC" firstStartedPulling="2025-11-26 12:13:48.839382515 +0000 UTC m=+126.746595867" lastFinishedPulling="2025-11-26 12:14:11.750960226 +0000 UTC m=+149.658173578" observedRunningTime="2025-11-26 12:14:12.167721737 +0000 UTC m=+150.074935090" watchObservedRunningTime="2025-11-26 12:14:12.1837984 +0000 UTC m=+150.091011743" Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.196460 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.199769 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-z2dpf" Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.208493 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jdszs" podStartSLOduration=3.467833001 podStartE2EDuration="26.208479392s" podCreationTimestamp="2025-11-26 12:13:46 +0000 UTC" firstStartedPulling="2025-11-26 12:13:48.847599312 +0000 UTC m=+126.754812664" lastFinishedPulling="2025-11-26 12:14:11.588245703 +0000 UTC m=+149.495459055" observedRunningTime="2025-11-26 12:14:12.208224421 +0000 UTC m=+150.115437764" watchObservedRunningTime="2025-11-26 
12:14:12.208479392 +0000 UTC m=+150.115692744" Nov 26 12:14:12 crc kubenswrapper[4834]: I1126 12:14:12.228327 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wt2n7" podStartSLOduration=2.4671086190000002 podStartE2EDuration="24.228304159s" podCreationTimestamp="2025-11-26 12:13:48 +0000 UTC" firstStartedPulling="2025-11-26 12:13:49.867788301 +0000 UTC m=+127.775001653" lastFinishedPulling="2025-11-26 12:14:11.628983842 +0000 UTC m=+149.536197193" observedRunningTime="2025-11-26 12:14:12.226353277 +0000 UTC m=+150.133566629" watchObservedRunningTime="2025-11-26 12:14:12.228304159 +0000 UTC m=+150.135517512" Nov 26 12:14:13 crc kubenswrapper[4834]: I1126 12:14:13.155081 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktqqx" event={"ID":"e9cd17a8-2071-42d3-a704-744ff5e8b53a","Type":"ContainerStarted","Data":"b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c"} Nov 26 12:14:13 crc kubenswrapper[4834]: I1126 12:14:13.160353 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssd6t" event={"ID":"498090a1-7333-4c7f-b48b-ae3ea0166165","Type":"ContainerStarted","Data":"e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd"} Nov 26 12:14:13 crc kubenswrapper[4834]: I1126 12:14:13.174622 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mqj69" podStartSLOduration=3.427500371 podStartE2EDuration="24.17460136s" podCreationTimestamp="2025-11-26 12:13:49 +0000 UTC" firstStartedPulling="2025-11-26 12:13:50.886816563 +0000 UTC m=+128.794029915" lastFinishedPulling="2025-11-26 12:14:11.633917553 +0000 UTC m=+149.541130904" observedRunningTime="2025-11-26 12:14:12.27484031 +0000 UTC m=+150.182053662" watchObservedRunningTime="2025-11-26 12:14:13.17460136 +0000 UTC m=+151.081814713" Nov 26 12:14:13 crc kubenswrapper[4834]: 
I1126 12:14:13.175127 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ktqqx" podStartSLOduration=3.840152625 podStartE2EDuration="27.17512059s" podCreationTimestamp="2025-11-26 12:13:46 +0000 UTC" firstStartedPulling="2025-11-26 12:13:48.83089746 +0000 UTC m=+126.738110812" lastFinishedPulling="2025-11-26 12:14:12.165865425 +0000 UTC m=+150.073078777" observedRunningTime="2025-11-26 12:14:13.171907436 +0000 UTC m=+151.079120788" watchObservedRunningTime="2025-11-26 12:14:13.17512059 +0000 UTC m=+151.082333942" Nov 26 12:14:13 crc kubenswrapper[4834]: I1126 12:14:13.185956 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ssd6t" podStartSLOduration=3.996185916 podStartE2EDuration="27.18594999s" podCreationTimestamp="2025-11-26 12:13:46 +0000 UTC" firstStartedPulling="2025-11-26 12:13:48.830023367 +0000 UTC m=+126.737236719" lastFinishedPulling="2025-11-26 12:14:12.019787441 +0000 UTC m=+149.927000793" observedRunningTime="2025-11-26 12:14:13.184507356 +0000 UTC m=+151.091720709" watchObservedRunningTime="2025-11-26 12:14:13.18594999 +0000 UTC m=+151.093163342" Nov 26 12:14:16 crc kubenswrapper[4834]: I1126 12:14:16.676863 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rt9fj"] Nov 26 12:14:16 crc kubenswrapper[4834]: I1126 12:14:16.714616 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:14:16 crc kubenswrapper[4834]: I1126 12:14:16.714650 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:14:16 crc kubenswrapper[4834]: I1126 12:14:16.822789 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:14:16 crc 
kubenswrapper[4834]: I1126 12:14:16.923618 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:14:16 crc kubenswrapper[4834]: I1126 12:14:16.923707 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:14:16 crc kubenswrapper[4834]: I1126 12:14:16.951572 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:14:17 crc kubenswrapper[4834]: I1126 12:14:17.165115 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:14:17 crc kubenswrapper[4834]: I1126 12:14:17.165510 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:14:17 crc kubenswrapper[4834]: I1126 12:14:17.199471 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:14:17 crc kubenswrapper[4834]: I1126 12:14:17.213552 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:14:17 crc kubenswrapper[4834]: I1126 12:14:17.215408 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:14:17 crc kubenswrapper[4834]: I1126 12:14:17.226770 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:14:17 crc kubenswrapper[4834]: I1126 12:14:17.386637 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:14:17 crc kubenswrapper[4834]: I1126 12:14:17.386680 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:14:17 crc kubenswrapper[4834]: I1126 12:14:17.415993 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:14:18 crc kubenswrapper[4834]: I1126 12:14:18.223710 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:14:18 crc kubenswrapper[4834]: I1126 12:14:18.899413 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:14:18 crc kubenswrapper[4834]: I1126 12:14:18.899463 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:14:18 crc kubenswrapper[4834]: I1126 12:14:18.928243 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.223027 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.273942 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdszs"] Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.274959 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jdszs" podUID="88a98958-391d-4c4a-9624-b52c51638e12" containerName="registry-server" containerID="cri-o://f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f" gracePeriod=2 Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.316543 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 
12:14:19.316596 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.347147 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.476986 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktqqx"] Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.590847 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.685410 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc9pd\" (UniqueName: \"kubernetes.io/projected/88a98958-391d-4c4a-9624-b52c51638e12-kube-api-access-gc9pd\") pod \"88a98958-391d-4c4a-9624-b52c51638e12\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.685649 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-utilities\") pod \"88a98958-391d-4c4a-9624-b52c51638e12\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.685811 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-catalog-content\") pod \"88a98958-391d-4c4a-9624-b52c51638e12\" (UID: \"88a98958-391d-4c4a-9624-b52c51638e12\") " Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.686341 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-utilities" 
(OuterVolumeSpecName: "utilities") pod "88a98958-391d-4c4a-9624-b52c51638e12" (UID: "88a98958-391d-4c4a-9624-b52c51638e12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.691421 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a98958-391d-4c4a-9624-b52c51638e12-kube-api-access-gc9pd" (OuterVolumeSpecName: "kube-api-access-gc9pd") pod "88a98958-391d-4c4a-9624-b52c51638e12" (UID: "88a98958-391d-4c4a-9624-b52c51638e12"). InnerVolumeSpecName "kube-api-access-gc9pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.720366 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88a98958-391d-4c4a-9624-b52c51638e12" (UID: "88a98958-391d-4c4a-9624-b52c51638e12"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.787145 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.787174 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a98958-391d-4c4a-9624-b52c51638e12-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.787187 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc9pd\" (UniqueName: \"kubernetes.io/projected/88a98958-391d-4c4a-9624-b52c51638e12-kube-api-access-gc9pd\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.967942 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.968018 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:14:19 crc kubenswrapper[4834]: I1126 12:14:19.999224 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.196661 4834 generic.go:334] "Generic (PLEG): container finished" podID="88a98958-391d-4c4a-9624-b52c51638e12" containerID="f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f" exitCode=0 Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.196956 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ktqqx" podUID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerName="registry-server" 
containerID="cri-o://b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c" gracePeriod=2 Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.197170 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdszs" event={"ID":"88a98958-391d-4c4a-9624-b52c51638e12","Type":"ContainerDied","Data":"f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f"} Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.197201 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdszs" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.197224 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdszs" event={"ID":"88a98958-391d-4c4a-9624-b52c51638e12","Type":"ContainerDied","Data":"0380f92dee94bb46c86902979539c434f09c6bb23bfe213036c95e202d89ddb1"} Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.197248 4834 scope.go:117] "RemoveContainer" containerID="f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.227447 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdszs"] Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.229858 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jdszs"] Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.233936 4834 scope.go:117] "RemoveContainer" containerID="0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.236859 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.239886 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.258615 4834 scope.go:117] "RemoveContainer" containerID="9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.334272 4834 scope.go:117] "RemoveContainer" containerID="f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f" Nov 26 12:14:20 crc kubenswrapper[4834]: E1126 12:14:20.334859 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f\": container with ID starting with f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f not found: ID does not exist" containerID="f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.334906 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f"} err="failed to get container status \"f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f\": rpc error: code = NotFound desc = could not find container \"f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f\": container with ID starting with f98187ea929d659e73d163135051ce76136640c309054089044820c0e55e3f3f not found: ID does not exist" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.334958 4834 scope.go:117] "RemoveContainer" containerID="0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c" Nov 26 12:14:20 crc kubenswrapper[4834]: E1126 12:14:20.335326 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c\": container with ID starting with 
0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c not found: ID does not exist" containerID="0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.335370 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c"} err="failed to get container status \"0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c\": rpc error: code = NotFound desc = could not find container \"0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c\": container with ID starting with 0dc1c7ee8c2f00b393f9efef107d0643fc8dacab2b41f4e658e4e3f4add2a14c not found: ID does not exist" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.335400 4834 scope.go:117] "RemoveContainer" containerID="9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096" Nov 26 12:14:20 crc kubenswrapper[4834]: E1126 12:14:20.335614 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096\": container with ID starting with 9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096 not found: ID does not exist" containerID="9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.335641 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096"} err="failed to get container status \"9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096\": rpc error: code = NotFound desc = could not find container \"9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096\": container with ID starting with 9b3b68679f859d84e054e3b9d60880f01d0331d93b8a8ccb81aecf295ee54096 not found: ID does not 
exist" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.382207 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.382253 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.415885 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.429114 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a98958-391d-4c4a-9624-b52c51638e12" path="/var/lib/kubelet/pods/88a98958-391d-4c4a-9624-b52c51638e12/volumes" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.510903 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.597122 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-utilities\") pod \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.597188 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-catalog-content\") pod \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.597335 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdflf\" (UniqueName: \"kubernetes.io/projected/e9cd17a8-2071-42d3-a704-744ff5e8b53a-kube-api-access-kdflf\") pod 
\"e9cd17a8-2071-42d3-a704-744ff5e8b53a\" (UID: \"e9cd17a8-2071-42d3-a704-744ff5e8b53a\") " Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.597868 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-utilities" (OuterVolumeSpecName: "utilities") pod "e9cd17a8-2071-42d3-a704-744ff5e8b53a" (UID: "e9cd17a8-2071-42d3-a704-744ff5e8b53a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.601705 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9cd17a8-2071-42d3-a704-744ff5e8b53a-kube-api-access-kdflf" (OuterVolumeSpecName: "kube-api-access-kdflf") pod "e9cd17a8-2071-42d3-a704-744ff5e8b53a" (UID: "e9cd17a8-2071-42d3-a704-744ff5e8b53a"). InnerVolumeSpecName "kube-api-access-kdflf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.638962 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9cd17a8-2071-42d3-a704-744ff5e8b53a" (UID: "e9cd17a8-2071-42d3-a704-744ff5e8b53a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.698908 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.698928 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9cd17a8-2071-42d3-a704-744ff5e8b53a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:20 crc kubenswrapper[4834]: I1126 12:14:20.698940 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdflf\" (UniqueName: \"kubernetes.io/projected/e9cd17a8-2071-42d3-a704-744ff5e8b53a-kube-api-access-kdflf\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.204078 4834 generic.go:334] "Generic (PLEG): container finished" podID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerID="b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c" exitCode=0 Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.204120 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktqqx" event={"ID":"e9cd17a8-2071-42d3-a704-744ff5e8b53a","Type":"ContainerDied","Data":"b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c"} Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.204168 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ktqqx" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.204189 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktqqx" event={"ID":"e9cd17a8-2071-42d3-a704-744ff5e8b53a","Type":"ContainerDied","Data":"25f4ff4b4d5aab26347ee766e65b0717c16cf2b4bb68f5d4f3dc8fe370d8c325"} Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.204213 4834 scope.go:117] "RemoveContainer" containerID="b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.220815 4834 scope.go:117] "RemoveContainer" containerID="a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.225228 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktqqx"] Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.227519 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ktqqx"] Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.241497 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.249230 4834 scope.go:117] "RemoveContainer" containerID="5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.261170 4834 scope.go:117] "RemoveContainer" containerID="b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c" Nov 26 12:14:21 crc kubenswrapper[4834]: E1126 12:14:21.261512 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c\": container with ID starting with 
b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c not found: ID does not exist" containerID="b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.261547 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c"} err="failed to get container status \"b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c\": rpc error: code = NotFound desc = could not find container \"b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c\": container with ID starting with b0e26105b661af1b113aea22d40b238b9c3fef016904c8de3d40c56000119c3c not found: ID does not exist" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.261573 4834 scope.go:117] "RemoveContainer" containerID="a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4" Nov 26 12:14:21 crc kubenswrapper[4834]: E1126 12:14:21.261859 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4\": container with ID starting with a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4 not found: ID does not exist" containerID="a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.261915 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4"} err="failed to get container status \"a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4\": rpc error: code = NotFound desc = could not find container \"a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4\": container with ID starting with a5b84f4204df7e7f380755990aa3b841d4d50fd1391b671dfaf5da9fe0bc09f4 not found: ID does not 
exist" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.261937 4834 scope.go:117] "RemoveContainer" containerID="5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068" Nov 26 12:14:21 crc kubenswrapper[4834]: E1126 12:14:21.262151 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068\": container with ID starting with 5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068 not found: ID does not exist" containerID="5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.262183 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068"} err="failed to get container status \"5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068\": rpc error: code = NotFound desc = could not find container \"5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068\": container with ID starting with 5363db089057001f4064fd587ef461a8ace25653896c3b07bac1ebf426470068 not found: ID does not exist" Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.531824 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.531883 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 
12:14:21 crc kubenswrapper[4834]: I1126 12:14:21.674574 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2n7"] Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.213250 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wt2n7" podUID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerName="registry-server" containerID="cri-o://ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208" gracePeriod=2 Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.424987 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" path="/var/lib/kubelet/pods/e9cd17a8-2071-42d3-a704-744ff5e8b53a/volumes" Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.539325 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.619270 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-catalog-content\") pod \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.619349 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwqr5\" (UniqueName: \"kubernetes.io/projected/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-kube-api-access-pwqr5\") pod \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.619399 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-utilities\") pod 
\"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\" (UID: \"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb\") " Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.620262 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-utilities" (OuterVolumeSpecName: "utilities") pod "49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" (UID: "49d03e80-ee1d-4afa-b773-cb3eb54ff2bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.623978 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-kube-api-access-pwqr5" (OuterVolumeSpecName: "kube-api-access-pwqr5") pod "49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" (UID: "49d03e80-ee1d-4afa-b773-cb3eb54ff2bb"). InnerVolumeSpecName "kube-api-access-pwqr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.632958 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" (UID: "49d03e80-ee1d-4afa-b773-cb3eb54ff2bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.720883 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwqr5\" (UniqueName: \"kubernetes.io/projected/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-kube-api-access-pwqr5\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.720923 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.720935 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:22 crc kubenswrapper[4834]: I1126 12:14:22.884398 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nt8mw" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.219695 4834 generic.go:334] "Generic (PLEG): container finished" podID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerID="ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208" exitCode=0 Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.219775 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2n7" event={"ID":"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb","Type":"ContainerDied","Data":"ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208"} Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.219820 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wt2n7" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.219837 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2n7" event={"ID":"49d03e80-ee1d-4afa-b773-cb3eb54ff2bb","Type":"ContainerDied","Data":"b3fcb84f321f33e85c24e129f4f09e42b99963f199d82e860f8d90b650e96bf4"} Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.219864 4834 scope.go:117] "RemoveContainer" containerID="ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.246867 4834 scope.go:117] "RemoveContainer" containerID="e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.252836 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2n7"] Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.257444 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2n7"] Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.272843 4834 scope.go:117] "RemoveContainer" containerID="1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.283600 4834 scope.go:117] "RemoveContainer" containerID="ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208" Nov 26 12:14:23 crc kubenswrapper[4834]: E1126 12:14:23.283992 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208\": container with ID starting with ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208 not found: ID does not exist" containerID="ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.284028 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208"} err="failed to get container status \"ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208\": rpc error: code = NotFound desc = could not find container \"ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208\": container with ID starting with ac38c727018a747a079d52cbcdebd4d0437285de4d050a8a771c49d124ff5208 not found: ID does not exist" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.284050 4834 scope.go:117] "RemoveContainer" containerID="e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601" Nov 26 12:14:23 crc kubenswrapper[4834]: E1126 12:14:23.284342 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601\": container with ID starting with e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601 not found: ID does not exist" containerID="e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.284366 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601"} err="failed to get container status \"e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601\": rpc error: code = NotFound desc = could not find container \"e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601\": container with ID starting with e7e625686d6eee2b73850b5b9523fb6ac027cd690cbe9ac24798a4ad2c327601 not found: ID does not exist" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.284388 4834 scope.go:117] "RemoveContainer" containerID="1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2" Nov 26 12:14:23 crc kubenswrapper[4834]: E1126 
12:14:23.284645 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2\": container with ID starting with 1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2 not found: ID does not exist" containerID="1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2" Nov 26 12:14:23 crc kubenswrapper[4834]: I1126 12:14:23.284667 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2"} err="failed to get container status \"1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2\": rpc error: code = NotFound desc = could not find container \"1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2\": container with ID starting with 1573c17ae27e326db1fdb983935a04ce08ec8b547ec04f4ae010922a5cfc60b2 not found: ID does not exist" Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.074547 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mmkwr"] Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.225239 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mmkwr" podUID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerName="registry-server" containerID="cri-o://d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351" gracePeriod=2 Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.426983 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" path="/var/lib/kubelet/pods/49d03e80-ee1d-4afa-b773-cb3eb54ff2bb/volumes" Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.533463 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.651989 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-utilities\") pod \"dd50fc84-b004-431d-a78f-2fdf871c3c02\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.652180 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px8ww\" (UniqueName: \"kubernetes.io/projected/dd50fc84-b004-431d-a78f-2fdf871c3c02-kube-api-access-px8ww\") pod \"dd50fc84-b004-431d-a78f-2fdf871c3c02\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.652802 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-utilities" (OuterVolumeSpecName: "utilities") pod "dd50fc84-b004-431d-a78f-2fdf871c3c02" (UID: "dd50fc84-b004-431d-a78f-2fdf871c3c02"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.652901 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-catalog-content\") pod \"dd50fc84-b004-431d-a78f-2fdf871c3c02\" (UID: \"dd50fc84-b004-431d-a78f-2fdf871c3c02\") " Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.653475 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.657294 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd50fc84-b004-431d-a78f-2fdf871c3c02-kube-api-access-px8ww" (OuterVolumeSpecName: "kube-api-access-px8ww") pod "dd50fc84-b004-431d-a78f-2fdf871c3c02" (UID: "dd50fc84-b004-431d-a78f-2fdf871c3c02"). InnerVolumeSpecName "kube-api-access-px8ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.723419 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd50fc84-b004-431d-a78f-2fdf871c3c02" (UID: "dd50fc84-b004-431d-a78f-2fdf871c3c02"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.754815 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px8ww\" (UniqueName: \"kubernetes.io/projected/dd50fc84-b004-431d-a78f-2fdf871c3c02-kube-api-access-px8ww\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:24 crc kubenswrapper[4834]: I1126 12:14:24.754845 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50fc84-b004-431d-a78f-2fdf871c3c02-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.072960 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vctqw"] Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.073211 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" podUID="94e98907-4981-47a2-b4ae-fa83a1a6e9ac" containerName="controller-manager" containerID="cri-o://0acb7e2723c49d198d919cac3d73caef8f19e1c375635dea13b0dbb0c365a575" gracePeriod=30 Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.168256 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw"] Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.168457 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" podUID="a5dda160-6a44-4f03-b9a4-9baeeae03a54" containerName="route-controller-manager" containerID="cri-o://188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27" gracePeriod=30 Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.231121 4834 generic.go:334] "Generic (PLEG): container finished" podID="94e98907-4981-47a2-b4ae-fa83a1a6e9ac" 
containerID="0acb7e2723c49d198d919cac3d73caef8f19e1c375635dea13b0dbb0c365a575" exitCode=0 Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.231205 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" event={"ID":"94e98907-4981-47a2-b4ae-fa83a1a6e9ac","Type":"ContainerDied","Data":"0acb7e2723c49d198d919cac3d73caef8f19e1c375635dea13b0dbb0c365a575"} Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.233662 4834 generic.go:334] "Generic (PLEG): container finished" podID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerID="d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351" exitCode=0 Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.233694 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmkwr" event={"ID":"dd50fc84-b004-431d-a78f-2fdf871c3c02","Type":"ContainerDied","Data":"d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351"} Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.233721 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mmkwr" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.233735 4834 scope.go:117] "RemoveContainer" containerID="d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.233721 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmkwr" event={"ID":"dd50fc84-b004-431d-a78f-2fdf871c3c02","Type":"ContainerDied","Data":"525bcbf5005f384a4e28c8962f1f81345e41400d011c99adfa75ab22b00618a3"} Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.261906 4834 scope.go:117] "RemoveContainer" containerID="b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.265445 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mmkwr"] Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.269142 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mmkwr"] Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.295837 4834 scope.go:117] "RemoveContainer" containerID="c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.313167 4834 scope.go:117] "RemoveContainer" containerID="d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351" Nov 26 12:14:25 crc kubenswrapper[4834]: E1126 12:14:25.313576 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351\": container with ID starting with d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351 not found: ID does not exist" containerID="d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.313616 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351"} err="failed to get container status \"d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351\": rpc error: code = NotFound desc = could not find container \"d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351\": container with ID starting with d958abf4d0dd32576338ff358ae120a58f2b9258bfa33b54caaa534ddd814351 not found: ID does not exist" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.313643 4834 scope.go:117] "RemoveContainer" containerID="b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2" Nov 26 12:14:25 crc kubenswrapper[4834]: E1126 12:14:25.314265 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2\": container with ID starting with b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2 not found: ID does not exist" containerID="b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.314281 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2"} err="failed to get container status \"b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2\": rpc error: code = NotFound desc = could not find container \"b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2\": container with ID starting with b88a27a04f7deae43e83ed60d9b292c866083de43b70039a8d80e96a84bf33b2 not found: ID does not exist" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.314295 4834 scope.go:117] "RemoveContainer" containerID="c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7" Nov 26 12:14:25 crc kubenswrapper[4834]: E1126 
12:14:25.314609 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7\": container with ID starting with c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7 not found: ID does not exist" containerID="c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.314642 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7"} err="failed to get container status \"c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7\": rpc error: code = NotFound desc = could not find container \"c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7\": container with ID starting with c4da011187e8a504fccbd33b2c38e94a4ca47a3375b3287818f95e39eea41ec7 not found: ID does not exist" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.390792 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.462552 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk6t7\" (UniqueName: \"kubernetes.io/projected/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-kube-api-access-kk6t7\") pod \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.462635 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-config\") pod \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.462720 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-proxy-ca-bundles\") pod \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.462751 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-serving-cert\") pod \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.462821 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-client-ca\") pod \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\" (UID: \"94e98907-4981-47a2-b4ae-fa83a1a6e9ac\") " Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.463429 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-client-ca" (OuterVolumeSpecName: "client-ca") pod "94e98907-4981-47a2-b4ae-fa83a1a6e9ac" (UID: "94e98907-4981-47a2-b4ae-fa83a1a6e9ac"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.463482 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "94e98907-4981-47a2-b4ae-fa83a1a6e9ac" (UID: "94e98907-4981-47a2-b4ae-fa83a1a6e9ac"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.463945 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-config" (OuterVolumeSpecName: "config") pod "94e98907-4981-47a2-b4ae-fa83a1a6e9ac" (UID: "94e98907-4981-47a2-b4ae-fa83a1a6e9ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.467428 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-kube-api-access-kk6t7" (OuterVolumeSpecName: "kube-api-access-kk6t7") pod "94e98907-4981-47a2-b4ae-fa83a1a6e9ac" (UID: "94e98907-4981-47a2-b4ae-fa83a1a6e9ac"). InnerVolumeSpecName "kube-api-access-kk6t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.467499 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "94e98907-4981-47a2-b4ae-fa83a1a6e9ac" (UID: "94e98907-4981-47a2-b4ae-fa83a1a6e9ac"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.502717 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.563606 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-client-ca\") pod \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.563695 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56qxv\" (UniqueName: \"kubernetes.io/projected/a5dda160-6a44-4f03-b9a4-9baeeae03a54-kube-api-access-56qxv\") pod \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.563749 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5dda160-6a44-4f03-b9a4-9baeeae03a54-serving-cert\") pod \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.563813 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-config\") pod \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\" (UID: \"a5dda160-6a44-4f03-b9a4-9baeeae03a54\") " Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.564058 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.564075 
4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk6t7\" (UniqueName: \"kubernetes.io/projected/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-kube-api-access-kk6t7\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.564084 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.564091 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.564099 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e98907-4981-47a2-b4ae-fa83a1a6e9ac-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.564300 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-client-ca" (OuterVolumeSpecName: "client-ca") pod "a5dda160-6a44-4f03-b9a4-9baeeae03a54" (UID: "a5dda160-6a44-4f03-b9a4-9baeeae03a54"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.564525 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-config" (OuterVolumeSpecName: "config") pod "a5dda160-6a44-4f03-b9a4-9baeeae03a54" (UID: "a5dda160-6a44-4f03-b9a4-9baeeae03a54"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.567896 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5dda160-6a44-4f03-b9a4-9baeeae03a54-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a5dda160-6a44-4f03-b9a4-9baeeae03a54" (UID: "a5dda160-6a44-4f03-b9a4-9baeeae03a54"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.567952 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5dda160-6a44-4f03-b9a4-9baeeae03a54-kube-api-access-56qxv" (OuterVolumeSpecName: "kube-api-access-56qxv") pod "a5dda160-6a44-4f03-b9a4-9baeeae03a54" (UID: "a5dda160-6a44-4f03-b9a4-9baeeae03a54"). InnerVolumeSpecName "kube-api-access-56qxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.665282 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.665327 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5dda160-6a44-4f03-b9a4-9baeeae03a54-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.665339 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56qxv\" (UniqueName: \"kubernetes.io/projected/a5dda160-6a44-4f03-b9a4-9baeeae03a54-kube-api-access-56qxv\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:25 crc kubenswrapper[4834]: I1126 12:14:25.665348 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5dda160-6a44-4f03-b9a4-9baeeae03a54-serving-cert\") on node \"crc\" DevicePath 
\"\"" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.240459 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" event={"ID":"94e98907-4981-47a2-b4ae-fa83a1a6e9ac","Type":"ContainerDied","Data":"4188e325f7f1709c7308e8b5ce53c8c63d1b91517a8204e71d6be0d642a7c437"} Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.240538 4834 scope.go:117] "RemoveContainer" containerID="0acb7e2723c49d198d919cac3d73caef8f19e1c375635dea13b0dbb0c365a575" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.240480 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vctqw" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.242821 4834 generic.go:334] "Generic (PLEG): container finished" podID="a5dda160-6a44-4f03-b9a4-9baeeae03a54" containerID="188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27" exitCode=0 Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.242865 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" event={"ID":"a5dda160-6a44-4f03-b9a4-9baeeae03a54","Type":"ContainerDied","Data":"188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27"} Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.242890 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" event={"ID":"a5dda160-6a44-4f03-b9a4-9baeeae03a54","Type":"ContainerDied","Data":"abd6518868a0a3ff91ab132aba9cb598dffdb99f2ee95047ab166a0be07041fe"} Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.242891 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.254545 4834 scope.go:117] "RemoveContainer" containerID="188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.266907 4834 scope.go:117] "RemoveContainer" containerID="188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27" Nov 26 12:14:26 crc kubenswrapper[4834]: E1126 12:14:26.267461 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27\": container with ID starting with 188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27 not found: ID does not exist" containerID="188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.267504 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27"} err="failed to get container status \"188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27\": rpc error: code = NotFound desc = could not find container \"188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27\": container with ID starting with 188d0ab8db0e0296cacba7f338f0217acf1c047d14da71858f1ed13c9616ba27 not found: ID does not exist" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.267648 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw"] Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.272036 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28bw"] Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.272596 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.275047 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.280856 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vctqw"] Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.285203 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vctqw"] Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.287198 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6feada4f-ea0c-4062-ab87-ff88a4590c96-metrics-certs\") pod \"network-metrics-daemon-tmlsw\" (UID: \"6feada4f-ea0c-4062-ab87-ff88a4590c96\") " pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.423150 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94e98907-4981-47a2-b4ae-fa83a1a6e9ac" path="/var/lib/kubelet/pods/94e98907-4981-47a2-b4ae-fa83a1a6e9ac/volumes" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.423742 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5dda160-6a44-4f03-b9a4-9baeeae03a54" path="/var/lib/kubelet/pods/a5dda160-6a44-4f03-b9a4-9baeeae03a54/volumes" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.424424 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd50fc84-b004-431d-a78f-2fdf871c3c02" 
path="/var/lib/kubelet/pods/dd50fc84-b004-431d-a78f-2fdf871c3c02/volumes" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.435488 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.443517 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tmlsw" Nov 26 12:14:26 crc kubenswrapper[4834]: I1126 12:14:26.611137 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tmlsw"] Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.045260 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm"] Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.045834 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerName="extract-content" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.045894 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerName="extract-content" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.045910 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerName="extract-utilities" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.045916 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerName="extract-utilities" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.045926 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a98958-391d-4c4a-9624-b52c51638e12" containerName="extract-content" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046016 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a98958-391d-4c4a-9624-b52c51638e12" 
containerName="extract-content" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046024 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046032 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046041 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046047 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046187 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5dda160-6a44-4f03-b9a4-9baeeae03a54" containerName="route-controller-manager" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046201 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5dda160-6a44-4f03-b9a4-9baeeae03a54" containerName="route-controller-manager" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046231 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701ce91a-c4be-40cb-b6f5-80974c8a43a5" containerName="pruner" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046237 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="701ce91a-c4be-40cb-b6f5-80974c8a43a5" containerName="pruner" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046247 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerName="extract-utilities" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046253 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" 
containerName="extract-utilities" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046263 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerName="extract-content" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046451 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerName="extract-content" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046463 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a98958-391d-4c4a-9624-b52c51638e12" containerName="extract-utilities" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046502 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a98958-391d-4c4a-9624-b52c51638e12" containerName="extract-utilities" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046541 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerName="extract-content" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046624 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerName="extract-content" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046679 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046777 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046789 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerName="extract-utilities" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046894 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50fc84-b004-431d-a78f-2fdf871c3c02" 
containerName="extract-utilities" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.046911 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e98907-4981-47a2-b4ae-fa83a1a6e9ac" containerName="controller-manager" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.046971 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e98907-4981-47a2-b4ae-fa83a1a6e9ac" containerName="controller-manager" Nov 26 12:14:27 crc kubenswrapper[4834]: E1126 12:14:27.047015 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a98958-391d-4c4a-9624-b52c51638e12" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.047027 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a98958-391d-4c4a-9624-b52c51638e12" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.048046 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9cd17a8-2071-42d3-a704-744ff5e8b53a" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.048153 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5dda160-6a44-4f03-b9a4-9baeeae03a54" containerName="route-controller-manager" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.048164 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a98958-391d-4c4a-9624-b52c51638e12" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.048175 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d03e80-ee1d-4afa-b773-cb3eb54ff2bb" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.048281 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="701ce91a-c4be-40cb-b6f5-80974c8a43a5" containerName="pruner" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.048291 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dd50fc84-b004-431d-a78f-2fdf871c3c02" containerName="registry-server" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.048298 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e98907-4981-47a2-b4ae-fa83a1a6e9ac" containerName="controller-manager" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.050078 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.053280 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z"] Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.053961 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.055455 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.055731 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.055859 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.056540 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.056665 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.056820 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.056896 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.058641 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.059022 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.059181 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.059199 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.060264 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm"] Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.063423 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.064490 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z"] Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.072251 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.182119 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwm5k\" (UniqueName: 
\"kubernetes.io/projected/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-kube-api-access-fwm5k\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.182215 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-config\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.182245 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-config\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.182267 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpd9d\" (UniqueName: \"kubernetes.io/projected/0599e8f4-3406-46f9-ae3b-b35f52339f64-kube-api-access-dpd9d\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.182428 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-proxy-ca-bundles\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " 
pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.182490 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-client-ca\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.182520 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0599e8f4-3406-46f9-ae3b-b35f52339f64-serving-cert\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.182544 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-client-ca\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.182558 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-serving-cert\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.251326 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-tmlsw" event={"ID":"6feada4f-ea0c-4062-ab87-ff88a4590c96","Type":"ContainerStarted","Data":"84df3a521049233e637a6ec9141b19773fcf161e6546cc18d67d08e4c4daed9e"} Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.251416 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" event={"ID":"6feada4f-ea0c-4062-ab87-ff88a4590c96","Type":"ContainerStarted","Data":"d01c782b57fb36b793c531caded49e727cb406366b8cb80496898cbd6b740ce0"} Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.251435 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tmlsw" event={"ID":"6feada4f-ea0c-4062-ab87-ff88a4590c96","Type":"ContainerStarted","Data":"712ab0f6cdb5cf2bc765c7b689bed4ba8d34329307e61940967ee598e0794a3d"} Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.276450 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tmlsw" podStartSLOduration=143.27642993 podStartE2EDuration="2m23.27642993s" podCreationTimestamp="2025-11-26 12:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:14:27.273656716 +0000 UTC m=+165.180870078" watchObservedRunningTime="2025-11-26 12:14:27.27642993 +0000 UTC m=+165.183643281" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.284097 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-client-ca\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.284133 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/0599e8f4-3406-46f9-ae3b-b35f52339f64-serving-cert\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.284154 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-client-ca\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.284170 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-serving-cert\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.284194 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwm5k\" (UniqueName: \"kubernetes.io/projected/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-kube-api-access-fwm5k\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.284228 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-config\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: 
I1126 12:14:27.284246 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-config\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.284263 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd9d\" (UniqueName: \"kubernetes.io/projected/0599e8f4-3406-46f9-ae3b-b35f52339f64-kube-api-access-dpd9d\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.284300 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-proxy-ca-bundles\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.285265 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-client-ca\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.285405 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-config\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: 
\"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.285492 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-config\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.285496 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-proxy-ca-bundles\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.286060 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-client-ca\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.292932 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0599e8f4-3406-46f9-ae3b-b35f52339f64-serving-cert\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.293007 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-serving-cert\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.302877 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwm5k\" (UniqueName: \"kubernetes.io/projected/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-kube-api-access-fwm5k\") pod \"controller-manager-5d76f5f7f7-jwv7z\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.305213 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpd9d\" (UniqueName: \"kubernetes.io/projected/0599e8f4-3406-46f9-ae3b-b35f52339f64-kube-api-access-dpd9d\") pod \"route-controller-manager-7764784f6c-c6btm\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.363652 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.375874 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.530516 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm"] Nov 26 12:14:27 crc kubenswrapper[4834]: W1126 12:14:27.547424 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0599e8f4_3406_46f9_ae3b_b35f52339f64.slice/crio-5bf0206fa145be23e2f63cccdc778f53342f4f524659160f21b472eb79385d70 WatchSource:0}: Error finding container 5bf0206fa145be23e2f63cccdc778f53342f4f524659160f21b472eb79385d70: Status 404 returned error can't find the container with id 5bf0206fa145be23e2f63cccdc778f53342f4f524659160f21b472eb79385d70 Nov 26 12:14:27 crc kubenswrapper[4834]: I1126 12:14:27.568782 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z"] Nov 26 12:14:27 crc kubenswrapper[4834]: W1126 12:14:27.575945 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7520706_3b4b_4a4b_99a0_64a65b0ccc06.slice/crio-67bde999b709a089fe893e42866bf76213e8b6fb1e2dbcebd04d2e3508919aec WatchSource:0}: Error finding container 67bde999b709a089fe893e42866bf76213e8b6fb1e2dbcebd04d2e3508919aec: Status 404 returned error can't find the container with id 67bde999b709a089fe893e42866bf76213e8b6fb1e2dbcebd04d2e3508919aec Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.260671 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" event={"ID":"c7520706-3b4b-4a4b-99a0-64a65b0ccc06","Type":"ContainerStarted","Data":"4859c5840a46a0f7d6f5b0a3760ced6165c87d0c97b8dc58f78e09a2fefc3050"} Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.260747 4834 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.260768 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" event={"ID":"c7520706-3b4b-4a4b-99a0-64a65b0ccc06","Type":"ContainerStarted","Data":"67bde999b709a089fe893e42866bf76213e8b6fb1e2dbcebd04d2e3508919aec"} Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.262580 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" event={"ID":"0599e8f4-3406-46f9-ae3b-b35f52339f64","Type":"ContainerStarted","Data":"ffdc879852aa7c0f541582199117ee43c5db48fdc921e93db237d524b495ce28"} Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.262623 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" event={"ID":"0599e8f4-3406-46f9-ae3b-b35f52339f64","Type":"ContainerStarted","Data":"5bf0206fa145be23e2f63cccdc778f53342f4f524659160f21b472eb79385d70"} Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.262780 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.264485 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.269496 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.274891 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" podStartSLOduration=3.274880053 podStartE2EDuration="3.274880053s" podCreationTimestamp="2025-11-26 12:14:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:14:28.273292045 +0000 UTC m=+166.180505398" watchObservedRunningTime="2025-11-26 12:14:28.274880053 +0000 UTC m=+166.182093404" Nov 26 12:14:28 crc kubenswrapper[4834]: I1126 12:14:28.289922 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" podStartSLOduration=3.289903197 podStartE2EDuration="3.289903197s" podCreationTimestamp="2025-11-26 12:14:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:14:28.285760568 +0000 UTC m=+166.192973920" watchObservedRunningTime="2025-11-26 12:14:28.289903197 +0000 UTC m=+166.197116549" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.512886 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.514111 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.515688 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.515966 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.523336 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.657187 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.657249 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.758606 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.758686 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.758714 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.776530 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:32 crc kubenswrapper[4834]: I1126 12:14:32.828818 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:33 crc kubenswrapper[4834]: I1126 12:14:33.201732 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 12:14:33 crc kubenswrapper[4834]: I1126 12:14:33.302782 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dc11c74c-97ab-40eb-b635-9ba5b2a6b917","Type":"ContainerStarted","Data":"ad338a294d86ac388458a1cc55e07c29d66cd0d21b1263de716059d87da56587"} Nov 26 12:14:34 crc kubenswrapper[4834]: I1126 12:14:34.310431 4834 generic.go:334] "Generic (PLEG): container finished" podID="dc11c74c-97ab-40eb-b635-9ba5b2a6b917" containerID="01c43f52f1d39e18e9d20efbf8ceb3442ecb8494c36f5283c3688adb612c71d1" exitCode=0 Nov 26 12:14:34 crc kubenswrapper[4834]: I1126 12:14:34.310541 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dc11c74c-97ab-40eb-b635-9ba5b2a6b917","Type":"ContainerDied","Data":"01c43f52f1d39e18e9d20efbf8ceb3442ecb8494c36f5283c3688adb612c71d1"} Nov 26 12:14:35 crc kubenswrapper[4834]: I1126 12:14:35.550608 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:35 crc kubenswrapper[4834]: I1126 12:14:35.691504 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kubelet-dir\") pod \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\" (UID: \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\") " Nov 26 12:14:35 crc kubenswrapper[4834]: I1126 12:14:35.691606 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kube-api-access\") pod \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\" (UID: \"dc11c74c-97ab-40eb-b635-9ba5b2a6b917\") " Nov 26 12:14:35 crc kubenswrapper[4834]: I1126 12:14:35.691687 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dc11c74c-97ab-40eb-b635-9ba5b2a6b917" (UID: "dc11c74c-97ab-40eb-b635-9ba5b2a6b917"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:14:35 crc kubenswrapper[4834]: I1126 12:14:35.692125 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:35 crc kubenswrapper[4834]: I1126 12:14:35.699003 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dc11c74c-97ab-40eb-b635-9ba5b2a6b917" (UID: "dc11c74c-97ab-40eb-b635-9ba5b2a6b917"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:35 crc kubenswrapper[4834]: I1126 12:14:35.793345 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11c74c-97ab-40eb-b635-9ba5b2a6b917-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:36 crc kubenswrapper[4834]: I1126 12:14:36.325512 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dc11c74c-97ab-40eb-b635-9ba5b2a6b917","Type":"ContainerDied","Data":"ad338a294d86ac388458a1cc55e07c29d66cd0d21b1263de716059d87da56587"} Nov 26 12:14:36 crc kubenswrapper[4834]: I1126 12:14:36.325560 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad338a294d86ac388458a1cc55e07c29d66cd0d21b1263de716059d87da56587" Nov 26 12:14:36 crc kubenswrapper[4834]: I1126 12:14:36.325609 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.509984 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 12:14:38 crc kubenswrapper[4834]: E1126 12:14:38.510565 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc11c74c-97ab-40eb-b635-9ba5b2a6b917" containerName="pruner" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.510581 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc11c74c-97ab-40eb-b635-9ba5b2a6b917" containerName="pruner" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.510697 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc11c74c-97ab-40eb-b635-9ba5b2a6b917" containerName="pruner" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.511264 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.514683 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.514879 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.522106 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.634872 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-var-lock\") pod \"installer-9-crc\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.635036 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54a38ab2-63cf-4044-8d03-1d84672f0f10-kube-api-access\") pod \"installer-9-crc\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.635082 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-kubelet-dir\") pod \"installer-9-crc\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.736502 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/54a38ab2-63cf-4044-8d03-1d84672f0f10-kube-api-access\") pod \"installer-9-crc\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.736557 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-kubelet-dir\") pod \"installer-9-crc\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.736644 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-var-lock\") pod \"installer-9-crc\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.736763 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-var-lock\") pod \"installer-9-crc\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.737142 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-kubelet-dir\") pod \"installer-9-crc\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.755364 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54a38ab2-63cf-4044-8d03-1d84672f0f10-kube-api-access\") pod \"installer-9-crc\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " 
pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:38 crc kubenswrapper[4834]: I1126 12:14:38.824966 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:14:39 crc kubenswrapper[4834]: I1126 12:14:39.198429 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 12:14:39 crc kubenswrapper[4834]: W1126 12:14:39.205616 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod54a38ab2_63cf_4044_8d03_1d84672f0f10.slice/crio-c45a4c0546bc485d1546dadd65ca14019bfc60712a6b64f322923c140ff9ec3b WatchSource:0}: Error finding container c45a4c0546bc485d1546dadd65ca14019bfc60712a6b64f322923c140ff9ec3b: Status 404 returned error can't find the container with id c45a4c0546bc485d1546dadd65ca14019bfc60712a6b64f322923c140ff9ec3b Nov 26 12:14:39 crc kubenswrapper[4834]: I1126 12:14:39.345135 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"54a38ab2-63cf-4044-8d03-1d84672f0f10","Type":"ContainerStarted","Data":"c45a4c0546bc485d1546dadd65ca14019bfc60712a6b64f322923c140ff9ec3b"} Nov 26 12:14:40 crc kubenswrapper[4834]: I1126 12:14:40.353699 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"54a38ab2-63cf-4044-8d03-1d84672f0f10","Type":"ContainerStarted","Data":"8af481aaa6b3abfa36657e06b426191a105468de9f5fae17a7487febcd2cd933"} Nov 26 12:14:40 crc kubenswrapper[4834]: I1126 12:14:40.370527 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.370500149 podStartE2EDuration="2.370500149s" podCreationTimestamp="2025-11-26 12:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:14:40.368376962 
+0000 UTC m=+178.275590314" watchObservedRunningTime="2025-11-26 12:14:40.370500149 +0000 UTC m=+178.277713501" Nov 26 12:14:41 crc kubenswrapper[4834]: I1126 12:14:41.706634 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" podUID="c2fa5531-03be-4051-968e-a3b00820266e" containerName="oauth-openshift" containerID="cri-o://d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e" gracePeriod=15 Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.126179 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176014 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-login\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176092 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-idp-0-file-data\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176132 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-error\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176161 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-f66f2\" (UniqueName: \"kubernetes.io/projected/c2fa5531-03be-4051-968e-a3b00820266e-kube-api-access-f66f2\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176184 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2fa5531-03be-4051-968e-a3b00820266e-audit-dir\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176222 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-cliconfig\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176248 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-router-certs\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176270 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-serving-cert\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176287 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-session\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176336 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-service-ca\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176366 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-audit-policies\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176397 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-provider-selection\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176415 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-trusted-ca-bundle\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176454 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-ocp-branding-template\") pod \"c2fa5531-03be-4051-968e-a3b00820266e\" (UID: \"c2fa5531-03be-4051-968e-a3b00820266e\") " Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176566 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2fa5531-03be-4051-968e-a3b00820266e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.176928 4834 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2fa5531-03be-4051-968e-a3b00820266e-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.177147 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.177196 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.177384 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.177581 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.182733 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.182933 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.183293 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fa5531-03be-4051-968e-a3b00820266e-kube-api-access-f66f2" (OuterVolumeSpecName: "kube-api-access-f66f2") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "kube-api-access-f66f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.183416 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.183606 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.183738 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.183865 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.184077 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.185084 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c2fa5531-03be-4051-968e-a3b00820266e" (UID: "c2fa5531-03be-4051-968e-a3b00820266e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.278838 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.278866 4834 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.278904 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.278919 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.278928 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.278941 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.278953 4834 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.279054 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.279065 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f66f2\" (UniqueName: \"kubernetes.io/projected/c2fa5531-03be-4051-968e-a3b00820266e-kube-api-access-f66f2\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.279075 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.279084 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.279093 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.279102 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2fa5531-03be-4051-968e-a3b00820266e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 
12:14:42.366611 4834 generic.go:334] "Generic (PLEG): container finished" podID="c2fa5531-03be-4051-968e-a3b00820266e" containerID="d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e" exitCode=0 Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.366665 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.366695 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" event={"ID":"c2fa5531-03be-4051-968e-a3b00820266e","Type":"ContainerDied","Data":"d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e"} Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.366798 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" event={"ID":"c2fa5531-03be-4051-968e-a3b00820266e","Type":"ContainerDied","Data":"1ad3342216c5db83b213cd72414a17694f4a43a7bec3765798c1560e8bef0845"} Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.366835 4834 scope.go:117] "RemoveContainer" containerID="d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.393587 4834 scope.go:117] "RemoveContainer" containerID="d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e" Nov 26 12:14:42 crc kubenswrapper[4834]: E1126 12:14:42.396685 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e\": container with ID starting with d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e not found: ID does not exist" containerID="d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.396779 4834 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e"} err="failed to get container status \"d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e\": rpc error: code = NotFound desc = could not find container \"d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e\": container with ID starting with d88fe1b39205786112d071727cb4de4d671d334fe1f9446c3691473d6d2c881e not found: ID does not exist" Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.402062 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rt9fj"] Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.404851 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rt9fj"] Nov 26 12:14:42 crc kubenswrapper[4834]: I1126 12:14:42.423291 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fa5531-03be-4051-968e-a3b00820266e" path="/var/lib/kubelet/pods/c2fa5531-03be-4051-968e-a3b00820266e/volumes" Nov 26 12:14:43 crc kubenswrapper[4834]: I1126 12:14:43.082037 4834 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-rt9fj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 12:14:43 crc kubenswrapper[4834]: I1126 12:14:43.082121 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-rt9fj" podUID="c2fa5531-03be-4051-968e-a3b00820266e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 
12:14:45.059302 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z"] Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.059849 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" podUID="c7520706-3b4b-4a4b-99a0-64a65b0ccc06" containerName="controller-manager" containerID="cri-o://4859c5840a46a0f7d6f5b0a3760ced6165c87d0c97b8dc58f78e09a2fefc3050" gracePeriod=30 Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.070685 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm"] Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.070904 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" podUID="0599e8f4-3406-46f9-ae3b-b35f52339f64" containerName="route-controller-manager" containerID="cri-o://ffdc879852aa7c0f541582199117ee43c5db48fdc921e93db237d524b495ce28" gracePeriod=30 Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.403721 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7520706-3b4b-4a4b-99a0-64a65b0ccc06" containerID="4859c5840a46a0f7d6f5b0a3760ced6165c87d0c97b8dc58f78e09a2fefc3050" exitCode=0 Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.404162 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" event={"ID":"c7520706-3b4b-4a4b-99a0-64a65b0ccc06","Type":"ContainerDied","Data":"4859c5840a46a0f7d6f5b0a3760ced6165c87d0c97b8dc58f78e09a2fefc3050"} Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.405926 4834 generic.go:334] "Generic (PLEG): container finished" podID="0599e8f4-3406-46f9-ae3b-b35f52339f64" containerID="ffdc879852aa7c0f541582199117ee43c5db48fdc921e93db237d524b495ce28" exitCode=0 
Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.405955 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" event={"ID":"0599e8f4-3406-46f9-ae3b-b35f52339f64","Type":"ContainerDied","Data":"ffdc879852aa7c0f541582199117ee43c5db48fdc921e93db237d524b495ce28"} Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.508650 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.511670 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.621884 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-client-ca\") pod \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.621953 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwm5k\" (UniqueName: \"kubernetes.io/projected/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-kube-api-access-fwm5k\") pod \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.621978 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-config\") pod \"0599e8f4-3406-46f9-ae3b-b35f52339f64\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.622027 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-proxy-ca-bundles\") pod \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.622095 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0599e8f4-3406-46f9-ae3b-b35f52339f64-serving-cert\") pod \"0599e8f4-3406-46f9-ae3b-b35f52339f64\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.622174 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-client-ca\") pod \"0599e8f4-3406-46f9-ae3b-b35f52339f64\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.622228 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-config\") pod \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.622268 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpd9d\" (UniqueName: \"kubernetes.io/projected/0599e8f4-3406-46f9-ae3b-b35f52339f64-kube-api-access-dpd9d\") pod \"0599e8f4-3406-46f9-ae3b-b35f52339f64\" (UID: \"0599e8f4-3406-46f9-ae3b-b35f52339f64\") " Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.622368 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-serving-cert\") pod \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\" (UID: \"c7520706-3b4b-4a4b-99a0-64a65b0ccc06\") " Nov 26 12:14:45 crc kubenswrapper[4834]: 
I1126 12:14:45.623059 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c7520706-3b4b-4a4b-99a0-64a65b0ccc06" (UID: "c7520706-3b4b-4a4b-99a0-64a65b0ccc06"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.623120 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-client-ca" (OuterVolumeSpecName: "client-ca") pod "c7520706-3b4b-4a4b-99a0-64a65b0ccc06" (UID: "c7520706-3b4b-4a4b-99a0-64a65b0ccc06"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.623233 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-client-ca" (OuterVolumeSpecName: "client-ca") pod "0599e8f4-3406-46f9-ae3b-b35f52339f64" (UID: "0599e8f4-3406-46f9-ae3b-b35f52339f64"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.623469 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-config" (OuterVolumeSpecName: "config") pod "c7520706-3b4b-4a4b-99a0-64a65b0ccc06" (UID: "c7520706-3b4b-4a4b-99a0-64a65b0ccc06"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.623597 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-config" (OuterVolumeSpecName: "config") pod "0599e8f4-3406-46f9-ae3b-b35f52339f64" (UID: "0599e8f4-3406-46f9-ae3b-b35f52339f64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.629855 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0599e8f4-3406-46f9-ae3b-b35f52339f64-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0599e8f4-3406-46f9-ae3b-b35f52339f64" (UID: "0599e8f4-3406-46f9-ae3b-b35f52339f64"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.630164 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-kube-api-access-fwm5k" (OuterVolumeSpecName: "kube-api-access-fwm5k") pod "c7520706-3b4b-4a4b-99a0-64a65b0ccc06" (UID: "c7520706-3b4b-4a4b-99a0-64a65b0ccc06"). InnerVolumeSpecName "kube-api-access-fwm5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.631057 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0599e8f4-3406-46f9-ae3b-b35f52339f64-kube-api-access-dpd9d" (OuterVolumeSpecName: "kube-api-access-dpd9d") pod "0599e8f4-3406-46f9-ae3b-b35f52339f64" (UID: "0599e8f4-3406-46f9-ae3b-b35f52339f64"). InnerVolumeSpecName "kube-api-access-dpd9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.631603 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c7520706-3b4b-4a4b-99a0-64a65b0ccc06" (UID: "c7520706-3b4b-4a4b-99a0-64a65b0ccc06"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.723804 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.723862 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.723875 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwm5k\" (UniqueName: \"kubernetes.io/projected/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-kube-api-access-fwm5k\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.723889 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.723898 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.723906 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0599e8f4-3406-46f9-ae3b-b35f52339f64-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.723915 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0599e8f4-3406-46f9-ae3b-b35f52339f64-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.723924 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7520706-3b4b-4a4b-99a0-64a65b0ccc06-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:45 crc kubenswrapper[4834]: I1126 12:14:45.723933 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpd9d\" (UniqueName: \"kubernetes.io/projected/0599e8f4-3406-46f9-ae3b-b35f52339f64-kube-api-access-dpd9d\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.051836 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7995c9657b-fbnwh"] Nov 26 12:14:46 crc kubenswrapper[4834]: E1126 12:14:46.052073 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0599e8f4-3406-46f9-ae3b-b35f52339f64" containerName="route-controller-manager" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.052089 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0599e8f4-3406-46f9-ae3b-b35f52339f64" containerName="route-controller-manager" Nov 26 12:14:46 crc kubenswrapper[4834]: E1126 12:14:46.052097 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7520706-3b4b-4a4b-99a0-64a65b0ccc06" containerName="controller-manager" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.052103 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7520706-3b4b-4a4b-99a0-64a65b0ccc06" containerName="controller-manager" Nov 26 12:14:46 crc kubenswrapper[4834]: E1126 12:14:46.052116 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c2fa5531-03be-4051-968e-a3b00820266e" containerName="oauth-openshift" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.052123 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fa5531-03be-4051-968e-a3b00820266e" containerName="oauth-openshift" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.052216 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7520706-3b4b-4a4b-99a0-64a65b0ccc06" containerName="controller-manager" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.052227 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2fa5531-03be-4051-968e-a3b00820266e" containerName="oauth-openshift" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.052234 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0599e8f4-3406-46f9-ae3b-b35f52339f64" containerName="route-controller-manager" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.052628 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.055425 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.055935 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.058995 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.059006 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.059070 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.059094 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.059031 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.059223 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.059688 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.059955 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 26 12:14:46 crc 
kubenswrapper[4834]: I1126 12:14:46.060062 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.060328 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.073234 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7995c9657b-fbnwh"] Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.075016 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.080081 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.086042 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.129275 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-audit-policies\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.129499 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-template-login\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " 
pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.129605 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28def30e-8dd5-40cd-9f54-9feecc4f7c48-audit-dir\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.129680 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.129749 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.129848 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.129932 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-template-error\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.130009 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzb49\" (UniqueName: \"kubernetes.io/projected/28def30e-8dd5-40cd-9f54-9feecc4f7c48-kube-api-access-pzb49\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.130099 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-service-ca\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.130177 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.130305 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.130367 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-router-certs\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.130393 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-session\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.130482 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232057 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: 
\"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232118 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-router-certs\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232145 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-session\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232164 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232194 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-audit-policies\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232236 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-template-login\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232296 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28def30e-8dd5-40cd-9f54-9feecc4f7c48-audit-dir\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232352 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232376 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232407 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " 
pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.232428 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-template-error\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.233145 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28def30e-8dd5-40cd-9f54-9feecc4f7c48-audit-dir\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.233865 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzb49\" (UniqueName: \"kubernetes.io/projected/28def30e-8dd5-40cd-9f54-9feecc4f7c48-kube-api-access-pzb49\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.233955 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.234171 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-service-ca\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.234299 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.234790 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-service-ca\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.235136 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.235876 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28def30e-8dd5-40cd-9f54-9feecc4f7c48-audit-policies\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " 
pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.238464 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-router-certs\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.238535 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-session\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.238550 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.238663 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-template-login\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.238844 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-template-error\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.240289 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.240478 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.241194 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28def30e-8dd5-40cd-9f54-9feecc4f7c48-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.250302 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzb49\" (UniqueName: \"kubernetes.io/projected/28def30e-8dd5-40cd-9f54-9feecc4f7c48-kube-api-access-pzb49\") pod \"oauth-openshift-7995c9657b-fbnwh\" (UID: \"28def30e-8dd5-40cd-9f54-9feecc4f7c48\") " 
pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.368604 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.413708 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" event={"ID":"c7520706-3b4b-4a4b-99a0-64a65b0ccc06","Type":"ContainerDied","Data":"67bde999b709a089fe893e42866bf76213e8b6fb1e2dbcebd04d2e3508919aec"} Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.413794 4834 scope.go:117] "RemoveContainer" containerID="4859c5840a46a0f7d6f5b0a3760ced6165c87d0c97b8dc58f78e09a2fefc3050" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.413792 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.416442 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.426300 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm" event={"ID":"0599e8f4-3406-46f9-ae3b-b35f52339f64","Type":"ContainerDied","Data":"5bf0206fa145be23e2f63cccdc778f53342f4f524659160f21b472eb79385d70"} Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.449455 4834 scope.go:117] "RemoveContainer" containerID="ffdc879852aa7c0f541582199117ee43c5db48fdc921e93db237d524b495ce28" Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.458068 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm"] Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.468668 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7764784f6c-c6btm"] Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.470360 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z"] Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.473052 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d76f5f7f7-jwv7z"] Nov 26 12:14:46 crc kubenswrapper[4834]: I1126 12:14:46.526680 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7995c9657b-fbnwh"] Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.055909 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb"] Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.057155 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.057878 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77885dd457-42jz2"] Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.058729 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.059637 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.060026 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.060167 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.060412 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.061676 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77885dd457-42jz2"] Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.064288 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.064458 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.064528 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.064559 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.064571 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.064729 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.064750 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.065037 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.066148 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb"] Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.069910 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.148335 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36fb9cee-29ce-42d6-97f0-06d1d44619fe-serving-cert\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.148398 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1da20727-891a-45f3-a22a-68180a244bf0-serving-cert\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.148446 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da20727-891a-45f3-a22a-68180a244bf0-config\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.148472 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/36fb9cee-29ce-42d6-97f0-06d1d44619fe-client-ca\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.148629 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66mqr\" (UniqueName: \"kubernetes.io/projected/36fb9cee-29ce-42d6-97f0-06d1d44619fe-kube-api-access-66mqr\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.148736 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcrrg\" (UniqueName: \"kubernetes.io/projected/1da20727-891a-45f3-a22a-68180a244bf0-kube-api-access-dcrrg\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " 
pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.148782 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fb9cee-29ce-42d6-97f0-06d1d44619fe-config\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.148814 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1da20727-891a-45f3-a22a-68180a244bf0-client-ca\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.148918 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1da20727-891a-45f3-a22a-68180a244bf0-proxy-ca-bundles\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.250151 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da20727-891a-45f3-a22a-68180a244bf0-config\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.250227 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/36fb9cee-29ce-42d6-97f0-06d1d44619fe-client-ca\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.250281 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66mqr\" (UniqueName: \"kubernetes.io/projected/36fb9cee-29ce-42d6-97f0-06d1d44619fe-kube-api-access-66mqr\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.250362 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcrrg\" (UniqueName: \"kubernetes.io/projected/1da20727-891a-45f3-a22a-68180a244bf0-kube-api-access-dcrrg\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.250389 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fb9cee-29ce-42d6-97f0-06d1d44619fe-config\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.250421 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1da20727-891a-45f3-a22a-68180a244bf0-client-ca\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 
12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.250472 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1da20727-891a-45f3-a22a-68180a244bf0-proxy-ca-bundles\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.250500 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36fb9cee-29ce-42d6-97f0-06d1d44619fe-serving-cert\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.250526 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1da20727-891a-45f3-a22a-68180a244bf0-serving-cert\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.251553 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/36fb9cee-29ce-42d6-97f0-06d1d44619fe-client-ca\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.251878 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1da20727-891a-45f3-a22a-68180a244bf0-client-ca\") pod \"controller-manager-77885dd457-42jz2\" (UID: 
\"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.252213 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da20727-891a-45f3-a22a-68180a244bf0-config\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.252337 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fb9cee-29ce-42d6-97f0-06d1d44619fe-config\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.252715 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1da20727-891a-45f3-a22a-68180a244bf0-proxy-ca-bundles\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.257448 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36fb9cee-29ce-42d6-97f0-06d1d44619fe-serving-cert\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.257903 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1da20727-891a-45f3-a22a-68180a244bf0-serving-cert\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.265179 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66mqr\" (UniqueName: \"kubernetes.io/projected/36fb9cee-29ce-42d6-97f0-06d1d44619fe-kube-api-access-66mqr\") pod \"route-controller-manager-7c96bd6859-zqzpb\" (UID: \"36fb9cee-29ce-42d6-97f0-06d1d44619fe\") " pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.267323 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcrrg\" (UniqueName: \"kubernetes.io/projected/1da20727-891a-45f3-a22a-68180a244bf0-kube-api-access-dcrrg\") pod \"controller-manager-77885dd457-42jz2\" (UID: \"1da20727-891a-45f3-a22a-68180a244bf0\") " pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.374170 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.381054 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.427134 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" event={"ID":"28def30e-8dd5-40cd-9f54-9feecc4f7c48","Type":"ContainerStarted","Data":"8eedf8ffb3177ab0495e777ff943d7c48bf7227326ce73804122bd650cef9c3c"} Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.427205 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" event={"ID":"28def30e-8dd5-40cd-9f54-9feecc4f7c48","Type":"ContainerStarted","Data":"542b17443293c65e1b27f6e585f4bb9353c86f29324100ac54fb535b3e8216c3"} Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.427453 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.436925 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.445147 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7995c9657b-fbnwh" podStartSLOduration=31.445135921 podStartE2EDuration="31.445135921s" podCreationTimestamp="2025-11-26 12:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:14:47.441993871 +0000 UTC m=+185.349207223" watchObservedRunningTime="2025-11-26 12:14:47.445135921 +0000 UTC m=+185.352349274" Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 12:14:47.631365 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77885dd457-42jz2"] Nov 26 12:14:47 crc kubenswrapper[4834]: I1126 
12:14:47.653909 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb"] Nov 26 12:14:47 crc kubenswrapper[4834]: W1126 12:14:47.664854 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36fb9cee_29ce_42d6_97f0_06d1d44619fe.slice/crio-d49a32f9135de615c4f0e92af5a1f2b2ad47baa3d77fb17013360d84c3883f80 WatchSource:0}: Error finding container d49a32f9135de615c4f0e92af5a1f2b2ad47baa3d77fb17013360d84c3883f80: Status 404 returned error can't find the container with id d49a32f9135de615c4f0e92af5a1f2b2ad47baa3d77fb17013360d84c3883f80 Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.423601 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0599e8f4-3406-46f9-ae3b-b35f52339f64" path="/var/lib/kubelet/pods/0599e8f4-3406-46f9-ae3b-b35f52339f64/volumes" Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.424500 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7520706-3b4b-4a4b-99a0-64a65b0ccc06" path="/var/lib/kubelet/pods/c7520706-3b4b-4a4b-99a0-64a65b0ccc06/volumes" Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.439179 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" event={"ID":"36fb9cee-29ce-42d6-97f0-06d1d44619fe","Type":"ContainerStarted","Data":"7675f561795673955326e7b2c4f0daefea0640f3d166da8b957c23a4df7a86b1"} Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.439229 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" event={"ID":"36fb9cee-29ce-42d6-97f0-06d1d44619fe","Type":"ContainerStarted","Data":"d49a32f9135de615c4f0e92af5a1f2b2ad47baa3d77fb17013360d84c3883f80"} Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.439468 4834 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.441139 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" event={"ID":"1da20727-891a-45f3-a22a-68180a244bf0","Type":"ContainerStarted","Data":"ad5b2a37e7d6bcc53aac5f7d785a6dd2a2f56753db8de40427adf590bed207bc"} Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.441186 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" event={"ID":"1da20727-891a-45f3-a22a-68180a244bf0","Type":"ContainerStarted","Data":"d97f84c491fd1ef8bffdade75872f079000a88c08172ef4ad9371e283e5f5f11"} Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.441384 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.444857 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.445767 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.476560 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c96bd6859-zqzpb" podStartSLOduration=3.476542164 podStartE2EDuration="3.476542164s" podCreationTimestamp="2025-11-26 12:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:14:48.458073627 +0000 UTC m=+186.365286980" watchObservedRunningTime="2025-11-26 
12:14:48.476542164 +0000 UTC m=+186.383755517" Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.498005 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77885dd457-42jz2" podStartSLOduration=3.497972111 podStartE2EDuration="3.497972111s" podCreationTimestamp="2025-11-26 12:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:14:48.49417826 +0000 UTC m=+186.401391612" watchObservedRunningTime="2025-11-26 12:14:48.497972111 +0000 UTC m=+186.405185462" Nov 26 12:14:48 crc kubenswrapper[4834]: I1126 12:14:48.734519 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 12:14:51 crc kubenswrapper[4834]: I1126 12:14:51.531164 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:14:51 crc kubenswrapper[4834]: I1126 12:14:51.531497 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:14:57 crc kubenswrapper[4834]: I1126 12:14:57.980754 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ssd6t"] Nov 26 12:14:57 crc kubenswrapper[4834]: I1126 12:14:57.981676 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ssd6t" podUID="498090a1-7333-4c7f-b48b-ae3ea0166165" 
containerName="registry-server" containerID="cri-o://e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd" gracePeriod=30 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.016239 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qsb7t"] Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.016610 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qsb7t" podUID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerName="registry-server" containerID="cri-o://e7b02f27acee5c74e7a2daa61db423e19e2d57e0938e16a1562cf2e399a99367" gracePeriod=30 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.035296 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcv95"] Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.037387 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" podUID="6c485dff-89c6-4d40-8ba4-3b69ac68820e" containerName="marketplace-operator" containerID="cri-o://5c3e3de614293cd0a4d2675ba6eeaba6170a045d2cdc48d410129553d7c79d37" gracePeriod=30 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.041683 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr48n"] Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.041997 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wr48n" podUID="14803438-3861-4da4-a23f-798966cbb5e4" containerName="registry-server" containerID="cri-o://9705e29f4f5f142b8e1303940d6a1cbfa85fadf6a76d6af4225c1e99aa1f0271" gracePeriod=30 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.044389 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqj69"] Nov 26 12:14:58 crc 
kubenswrapper[4834]: I1126 12:14:58.044748 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mqj69" podUID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerName="registry-server" containerID="cri-o://84d9f5f513e8047d2abde8e0876bc7fb60bf707beb670c263aa4bd90535426c9" gracePeriod=30 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.045804 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gv52n"] Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.046619 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.050398 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gv52n"] Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.218703 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7e72b81-7b18-4afb-ad1e-818ff77aaf27-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gv52n\" (UID: \"c7e72b81-7b18-4afb-ad1e-818ff77aaf27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.218747 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr8gv\" (UniqueName: \"kubernetes.io/projected/c7e72b81-7b18-4afb-ad1e-818ff77aaf27-kube-api-access-lr8gv\") pod \"marketplace-operator-79b997595-gv52n\" (UID: \"c7e72b81-7b18-4afb-ad1e-818ff77aaf27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.218993 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7e72b81-7b18-4afb-ad1e-818ff77aaf27-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gv52n\" (UID: \"c7e72b81-7b18-4afb-ad1e-818ff77aaf27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.320592 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7e72b81-7b18-4afb-ad1e-818ff77aaf27-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gv52n\" (UID: \"c7e72b81-7b18-4afb-ad1e-818ff77aaf27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.320701 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7e72b81-7b18-4afb-ad1e-818ff77aaf27-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gv52n\" (UID: \"c7e72b81-7b18-4afb-ad1e-818ff77aaf27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.320758 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr8gv\" (UniqueName: \"kubernetes.io/projected/c7e72b81-7b18-4afb-ad1e-818ff77aaf27-kube-api-access-lr8gv\") pod \"marketplace-operator-79b997595-gv52n\" (UID: \"c7e72b81-7b18-4afb-ad1e-818ff77aaf27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.322097 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7e72b81-7b18-4afb-ad1e-818ff77aaf27-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gv52n\" (UID: \"c7e72b81-7b18-4afb-ad1e-818ff77aaf27\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.330736 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7e72b81-7b18-4afb-ad1e-818ff77aaf27-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gv52n\" (UID: \"c7e72b81-7b18-4afb-ad1e-818ff77aaf27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.344453 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr8gv\" (UniqueName: \"kubernetes.io/projected/c7e72b81-7b18-4afb-ad1e-818ff77aaf27-kube-api-access-lr8gv\") pod \"marketplace-operator-79b997595-gv52n\" (UID: \"c7e72b81-7b18-4afb-ad1e-818ff77aaf27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.392972 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.401412 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.505965 4834 generic.go:334] "Generic (PLEG): container finished" podID="6c485dff-89c6-4d40-8ba4-3b69ac68820e" containerID="5c3e3de614293cd0a4d2675ba6eeaba6170a045d2cdc48d410129553d7c79d37" exitCode=0 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.506050 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" event={"ID":"6c485dff-89c6-4d40-8ba4-3b69ac68820e","Type":"ContainerDied","Data":"5c3e3de614293cd0a4d2675ba6eeaba6170a045d2cdc48d410129553d7c79d37"} Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.508104 4834 generic.go:334] "Generic (PLEG): container finished" podID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerID="e7b02f27acee5c74e7a2daa61db423e19e2d57e0938e16a1562cf2e399a99367" exitCode=0 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.508150 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsb7t" event={"ID":"df50fe7e-b188-43c2-bfeb-e82b35741ad1","Type":"ContainerDied","Data":"e7b02f27acee5c74e7a2daa61db423e19e2d57e0938e16a1562cf2e399a99367"} Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.510819 4834 generic.go:334] "Generic (PLEG): container finished" podID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerID="e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd" exitCode=0 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.510881 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ssd6t" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.510881 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssd6t" event={"ID":"498090a1-7333-4c7f-b48b-ae3ea0166165","Type":"ContainerDied","Data":"e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd"} Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.510992 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssd6t" event={"ID":"498090a1-7333-4c7f-b48b-ae3ea0166165","Type":"ContainerDied","Data":"e11e08d133a8439dc89c67f61e0c1e8c56658f03842abd08f799390e8252ad27"} Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.511013 4834 scope.go:117] "RemoveContainer" containerID="e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.513188 4834 generic.go:334] "Generic (PLEG): container finished" podID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerID="84d9f5f513e8047d2abde8e0876bc7fb60bf707beb670c263aa4bd90535426c9" exitCode=0 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.513241 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqj69" event={"ID":"43791ad6-52e3-4dff-8b3b-3098c3011e85","Type":"ContainerDied","Data":"84d9f5f513e8047d2abde8e0876bc7fb60bf707beb670c263aa4bd90535426c9"} Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.520403 4834 generic.go:334] "Generic (PLEG): container finished" podID="14803438-3861-4da4-a23f-798966cbb5e4" containerID="9705e29f4f5f142b8e1303940d6a1cbfa85fadf6a76d6af4225c1e99aa1f0271" exitCode=0 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.520432 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr48n" 
event={"ID":"14803438-3861-4da4-a23f-798966cbb5e4","Type":"ContainerDied","Data":"9705e29f4f5f142b8e1303940d6a1cbfa85fadf6a76d6af4225c1e99aa1f0271"} Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.534339 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-utilities\") pod \"498090a1-7333-4c7f-b48b-ae3ea0166165\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.534538 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mr2h\" (UniqueName: \"kubernetes.io/projected/498090a1-7333-4c7f-b48b-ae3ea0166165-kube-api-access-8mr2h\") pod \"498090a1-7333-4c7f-b48b-ae3ea0166165\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.534620 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-catalog-content\") pod \"498090a1-7333-4c7f-b48b-ae3ea0166165\" (UID: \"498090a1-7333-4c7f-b48b-ae3ea0166165\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.535088 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-utilities" (OuterVolumeSpecName: "utilities") pod "498090a1-7333-4c7f-b48b-ae3ea0166165" (UID: "498090a1-7333-4c7f-b48b-ae3ea0166165"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.541590 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.545083 4834 scope.go:117] "RemoveContainer" containerID="578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.545641 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/498090a1-7333-4c7f-b48b-ae3ea0166165-kube-api-access-8mr2h" (OuterVolumeSpecName: "kube-api-access-8mr2h") pod "498090a1-7333-4c7f-b48b-ae3ea0166165" (UID: "498090a1-7333-4c7f-b48b-ae3ea0166165"). InnerVolumeSpecName "kube-api-access-8mr2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.551587 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mr2h\" (UniqueName: \"kubernetes.io/projected/498090a1-7333-4c7f-b48b-ae3ea0166165-kube-api-access-8mr2h\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.551607 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.575409 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.586752 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "498090a1-7333-4c7f-b48b-ae3ea0166165" (UID: "498090a1-7333-4c7f-b48b-ae3ea0166165"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.588036 4834 scope.go:117] "RemoveContainer" containerID="f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.588223 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.610050 4834 scope.go:117] "RemoveContainer" containerID="e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd" Nov 26 12:14:58 crc kubenswrapper[4834]: E1126 12:14:58.610793 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd\": container with ID starting with e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd not found: ID does not exist" containerID="e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.610841 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd"} err="failed to get container status \"e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd\": rpc error: code = NotFound desc = could not find container \"e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd\": container with ID starting with e9c74a1f1c4b3f0a6035adc528bec5cfd01b8a24d3f9f4e8aa08dce67a3e4abd not found: ID does not exist" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.610866 4834 scope.go:117] "RemoveContainer" containerID="578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f" Nov 26 12:14:58 crc kubenswrapper[4834]: E1126 12:14:58.611104 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f\": container with ID starting with 578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f not found: ID does not exist" containerID="578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.611500 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f"} err="failed to get container status \"578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f\": rpc error: code = NotFound desc = could not find container \"578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f\": container with ID starting with 578d3938150023bef4062064605b1c9b1d94e4bbb6f75538524f6ad4702e080f not found: ID does not exist" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.611518 4834 scope.go:117] "RemoveContainer" containerID="f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65" Nov 26 12:14:58 crc kubenswrapper[4834]: E1126 12:14:58.612096 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65\": container with ID starting with f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65 not found: ID does not exist" containerID="f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.612148 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65"} err="failed to get container status \"f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65\": rpc error: code = NotFound desc = could not find container 
\"f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65\": container with ID starting with f09d8bde4a444f56e37c5196f7e0cb1423983e88866ecdac384fdf607df35f65 not found: ID does not exist" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.652838 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-utilities\") pod \"14803438-3861-4da4-a23f-798966cbb5e4\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.652928 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-catalog-content\") pod \"14803438-3861-4da4-a23f-798966cbb5e4\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.653012 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p5mb\" (UniqueName: \"kubernetes.io/projected/14803438-3861-4da4-a23f-798966cbb5e4-kube-api-access-9p5mb\") pod \"14803438-3861-4da4-a23f-798966cbb5e4\" (UID: \"14803438-3861-4da4-a23f-798966cbb5e4\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.653119 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-catalog-content\") pod \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.653426 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/498090a1-7333-4c7f-b48b-ae3ea0166165-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.655144 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-utilities" (OuterVolumeSpecName: "utilities") pod "14803438-3861-4da4-a23f-798966cbb5e4" (UID: "14803438-3861-4da4-a23f-798966cbb5e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.657581 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14803438-3861-4da4-a23f-798966cbb5e4-kube-api-access-9p5mb" (OuterVolumeSpecName: "kube-api-access-9p5mb") pod "14803438-3861-4da4-a23f-798966cbb5e4" (UID: "14803438-3861-4da4-a23f-798966cbb5e4"). InnerVolumeSpecName "kube-api-access-9p5mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.658966 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.670094 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14803438-3861-4da4-a23f-798966cbb5e4" (UID: "14803438-3861-4da4-a23f-798966cbb5e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.702283 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df50fe7e-b188-43c2-bfeb-e82b35741ad1" (UID: "df50fe7e-b188-43c2-bfeb-e82b35741ad1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.754723 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-operator-metrics\") pod \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.754788 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-trusted-ca\") pod \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.754849 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgsjw\" (UniqueName: \"kubernetes.io/projected/df50fe7e-b188-43c2-bfeb-e82b35741ad1-kube-api-access-kgsjw\") pod \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.754922 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-utilities\") pod \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\" (UID: \"df50fe7e-b188-43c2-bfeb-e82b35741ad1\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.754979 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rds48\" (UniqueName: \"kubernetes.io/projected/6c485dff-89c6-4d40-8ba4-3b69ac68820e-kube-api-access-rds48\") pod \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\" (UID: \"6c485dff-89c6-4d40-8ba4-3b69ac68820e\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.755519 4834 reconciler_common.go:293] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.755534 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p5mb\" (UniqueName: \"kubernetes.io/projected/14803438-3861-4da4-a23f-798966cbb5e4-kube-api-access-9p5mb\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.755546 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.755555 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14803438-3861-4da4-a23f-798966cbb5e4-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.755668 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6c485dff-89c6-4d40-8ba4-3b69ac68820e" (UID: "6c485dff-89c6-4d40-8ba4-3b69ac68820e"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.755906 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-utilities" (OuterVolumeSpecName: "utilities") pod "df50fe7e-b188-43c2-bfeb-e82b35741ad1" (UID: "df50fe7e-b188-43c2-bfeb-e82b35741ad1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.757941 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6c485dff-89c6-4d40-8ba4-3b69ac68820e" (UID: "6c485dff-89c6-4d40-8ba4-3b69ac68820e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.758327 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df50fe7e-b188-43c2-bfeb-e82b35741ad1-kube-api-access-kgsjw" (OuterVolumeSpecName: "kube-api-access-kgsjw") pod "df50fe7e-b188-43c2-bfeb-e82b35741ad1" (UID: "df50fe7e-b188-43c2-bfeb-e82b35741ad1"). InnerVolumeSpecName "kube-api-access-kgsjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.758836 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c485dff-89c6-4d40-8ba4-3b69ac68820e-kube-api-access-rds48" (OuterVolumeSpecName: "kube-api-access-rds48") pod "6c485dff-89c6-4d40-8ba4-3b69ac68820e" (UID: "6c485dff-89c6-4d40-8ba4-3b69ac68820e"). InnerVolumeSpecName "kube-api-access-rds48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.839013 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ssd6t"] Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.846892 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ssd6t"] Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.856275 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmjfl\" (UniqueName: \"kubernetes.io/projected/43791ad6-52e3-4dff-8b3b-3098c3011e85-kube-api-access-vmjfl\") pod \"43791ad6-52e3-4dff-8b3b-3098c3011e85\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.856458 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-utilities\") pod \"43791ad6-52e3-4dff-8b3b-3098c3011e85\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.856567 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-catalog-content\") pod \"43791ad6-52e3-4dff-8b3b-3098c3011e85\" (UID: \"43791ad6-52e3-4dff-8b3b-3098c3011e85\") " Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.856837 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgsjw\" (UniqueName: \"kubernetes.io/projected/df50fe7e-b188-43c2-bfeb-e82b35741ad1-kube-api-access-kgsjw\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.856855 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df50fe7e-b188-43c2-bfeb-e82b35741ad1-utilities\") on node \"crc\" 
DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.856867 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rds48\" (UniqueName: \"kubernetes.io/projected/6c485dff-89c6-4d40-8ba4-3b69ac68820e-kube-api-access-rds48\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.856877 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.856886 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c485dff-89c6-4d40-8ba4-3b69ac68820e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.856987 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-utilities" (OuterVolumeSpecName: "utilities") pod "43791ad6-52e3-4dff-8b3b-3098c3011e85" (UID: "43791ad6-52e3-4dff-8b3b-3098c3011e85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.859724 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43791ad6-52e3-4dff-8b3b-3098c3011e85-kube-api-access-vmjfl" (OuterVolumeSpecName: "kube-api-access-vmjfl") pod "43791ad6-52e3-4dff-8b3b-3098c3011e85" (UID: "43791ad6-52e3-4dff-8b3b-3098c3011e85"). InnerVolumeSpecName "kube-api-access-vmjfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.859953 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gv52n"] Nov 26 12:14:58 crc kubenswrapper[4834]: W1126 12:14:58.864844 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e72b81_7b18_4afb_ad1e_818ff77aaf27.slice/crio-88e09891969101c213bf9c075ea5b56e6dca6698c2e1b5eabbcecdbf637ea374 WatchSource:0}: Error finding container 88e09891969101c213bf9c075ea5b56e6dca6698c2e1b5eabbcecdbf637ea374: Status 404 returned error can't find the container with id 88e09891969101c213bf9c075ea5b56e6dca6698c2e1b5eabbcecdbf637ea374 Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.927093 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43791ad6-52e3-4dff-8b3b-3098c3011e85" (UID: "43791ad6-52e3-4dff-8b3b-3098c3011e85"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.958587 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.958615 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43791ad6-52e3-4dff-8b3b-3098c3011e85-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:58 crc kubenswrapper[4834]: I1126 12:14:58.958630 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmjfl\" (UniqueName: \"kubernetes.io/projected/43791ad6-52e3-4dff-8b3b-3098c3011e85-kube-api-access-vmjfl\") on node \"crc\" DevicePath \"\"" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.529972 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr48n" event={"ID":"14803438-3861-4da4-a23f-798966cbb5e4","Type":"ContainerDied","Data":"2afd59cf4240359c12ed385e762cfb55d3041b7353ff076c497181216639c6e9"} Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.530520 4834 scope.go:117] "RemoveContainer" containerID="9705e29f4f5f142b8e1303940d6a1cbfa85fadf6a76d6af4225c1e99aa1f0271" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.530014 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wr48n" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.532130 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" event={"ID":"6c485dff-89c6-4d40-8ba4-3b69ac68820e","Type":"ContainerDied","Data":"03e8ad3e53abeae126470c6feb9580c98d7fde733b4cde7511cb6ca2afd18778"} Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.532245 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pcv95" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.537623 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsb7t" event={"ID":"df50fe7e-b188-43c2-bfeb-e82b35741ad1","Type":"ContainerDied","Data":"fe1b6a8f11e7ee2374140471a2e03a8a08c682f18298415d6ce56637e45724d2"} Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.537806 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qsb7t" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.544657 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqj69" event={"ID":"43791ad6-52e3-4dff-8b3b-3098c3011e85","Type":"ContainerDied","Data":"c9cd2f5c9f5b4ec320c7ffc83f68eccf07c07c66c759b82ca51226f0e587d1c6"} Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.545030 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqj69" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.546778 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" event={"ID":"c7e72b81-7b18-4afb-ad1e-818ff77aaf27","Type":"ContainerStarted","Data":"34fc5b22847d69c8cacb1d5c1c6803881703c2e804d46d7e59da69db0625adcf"} Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.546834 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" event={"ID":"c7e72b81-7b18-4afb-ad1e-818ff77aaf27","Type":"ContainerStarted","Data":"88e09891969101c213bf9c075ea5b56e6dca6698c2e1b5eabbcecdbf637ea374"} Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.547362 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.549200 4834 scope.go:117] "RemoveContainer" containerID="7bd8dea99bc330c365389d4a84c05fea1e2a5be3d32a244cd47a935f6ae853d0" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.552608 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.566854 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gv52n" podStartSLOduration=1.5668406990000001 podStartE2EDuration="1.566840699s" podCreationTimestamp="2025-11-26 12:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:14:59.563403773 +0000 UTC m=+197.470617124" watchObservedRunningTime="2025-11-26 12:14:59.566840699 +0000 UTC m=+197.474054051" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.600452 4834 
scope.go:117] "RemoveContainer" containerID="7d0ce19d3bc307943fb0dcff53da8d566d320ca5886cd59e20df9427ff82197c" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.606232 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcv95"] Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.609366 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcv95"] Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.616226 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr48n"] Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.618407 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr48n"] Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.620224 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qsb7t"] Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.623578 4834 scope.go:117] "RemoveContainer" containerID="5c3e3de614293cd0a4d2675ba6eeaba6170a045d2cdc48d410129553d7c79d37" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.626993 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qsb7t"] Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.636926 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqj69"] Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.638200 4834 scope.go:117] "RemoveContainer" containerID="e7b02f27acee5c74e7a2daa61db423e19e2d57e0938e16a1562cf2e399a99367" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.639244 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mqj69"] Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.648955 4834 scope.go:117] "RemoveContainer" 
containerID="40d89df39d5041b71cd47bb6855a15561cf7c89392d0591f79f4b554d6b8954a" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.662486 4834 scope.go:117] "RemoveContainer" containerID="32ed09df8c774278ee303e749d111b7f344dc2a20d27555ca24c8f43947d88c3" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.675614 4834 scope.go:117] "RemoveContainer" containerID="84d9f5f513e8047d2abde8e0876bc7fb60bf707beb670c263aa4bd90535426c9" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.692448 4834 scope.go:117] "RemoveContainer" containerID="9122400632aa0ac0c6650a1a84fc2f34a2b8827a0bee9d8f95be249fbd3385f2" Nov 26 12:14:59 crc kubenswrapper[4834]: I1126 12:14:59.704833 4834 scope.go:117] "RemoveContainer" containerID="c7252708fed4ed3bb8dd6be93e30dc70fa08953dc47f7b41f9e430089337b23b" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126051 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj"] Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126258 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerName="extract-content" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126273 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerName="extract-content" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126282 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126287 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126297 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14803438-3861-4da4-a23f-798966cbb5e4" containerName="extract-utilities" Nov 26 
12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126302 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="14803438-3861-4da4-a23f-798966cbb5e4" containerName="extract-utilities" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126324 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerName="extract-content" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126330 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerName="extract-content" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126338 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126344 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126350 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14803438-3861-4da4-a23f-798966cbb5e4" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126356 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="14803438-3861-4da4-a23f-798966cbb5e4" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126363 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerName="extract-utilities" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126370 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerName="extract-utilities" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126377 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerName="extract-utilities" Nov 26 
12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126384 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerName="extract-utilities" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126391 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c485dff-89c6-4d40-8ba4-3b69ac68820e" containerName="marketplace-operator" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126396 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c485dff-89c6-4d40-8ba4-3b69ac68820e" containerName="marketplace-operator" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126405 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14803438-3861-4da4-a23f-798966cbb5e4" containerName="extract-content" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126410 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="14803438-3861-4da4-a23f-798966cbb5e4" containerName="extract-content" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126418 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126423 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126431 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerName="extract-content" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126436 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerName="extract-content" Nov 26 12:15:00 crc kubenswrapper[4834]: E1126 12:15:00.126445 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerName="extract-utilities" Nov 26 
12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126450 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerName="extract-utilities" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126548 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126564 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c485dff-89c6-4d40-8ba4-3b69ac68820e" containerName="marketplace-operator" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126571 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="498090a1-7333-4c7f-b48b-ae3ea0166165" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126578 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="43791ad6-52e3-4dff-8b3b-3098c3011e85" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126585 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="14803438-3861-4da4-a23f-798966cbb5e4" containerName="registry-server" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.126994 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.128556 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.128611 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.134062 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj"] Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.201390 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rt9bt"] Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.202397 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.205064 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.210187 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rt9bt"] Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.275155 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j45hg\" (UniqueName: \"kubernetes.io/projected/25328d2d-3fa8-4a87-bc53-dc9088802bbf-kube-api-access-j45hg\") pod \"collect-profiles-29402655-trzcj\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.275214 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/073c7670-7cb5-4160-b6f2-d301f594dd00-catalog-content\") pod \"redhat-marketplace-rt9bt\" (UID: \"073c7670-7cb5-4160-b6f2-d301f594dd00\") " pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.275257 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl4xv\" (UniqueName: \"kubernetes.io/projected/073c7670-7cb5-4160-b6f2-d301f594dd00-kube-api-access-dl4xv\") pod \"redhat-marketplace-rt9bt\" (UID: \"073c7670-7cb5-4160-b6f2-d301f594dd00\") " pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.275283 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25328d2d-3fa8-4a87-bc53-dc9088802bbf-config-volume\") pod \"collect-profiles-29402655-trzcj\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.275321 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/073c7670-7cb5-4160-b6f2-d301f594dd00-utilities\") pod \"redhat-marketplace-rt9bt\" (UID: \"073c7670-7cb5-4160-b6f2-d301f594dd00\") " pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.275511 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25328d2d-3fa8-4a87-bc53-dc9088802bbf-secret-volume\") pod \"collect-profiles-29402655-trzcj\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.376367 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/073c7670-7cb5-4160-b6f2-d301f594dd00-catalog-content\") pod \"redhat-marketplace-rt9bt\" (UID: \"073c7670-7cb5-4160-b6f2-d301f594dd00\") " pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.376422 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl4xv\" (UniqueName: \"kubernetes.io/projected/073c7670-7cb5-4160-b6f2-d301f594dd00-kube-api-access-dl4xv\") pod \"redhat-marketplace-rt9bt\" (UID: \"073c7670-7cb5-4160-b6f2-d301f594dd00\") " pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.376442 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25328d2d-3fa8-4a87-bc53-dc9088802bbf-config-volume\") pod \"collect-profiles-29402655-trzcj\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.376904 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/073c7670-7cb5-4160-b6f2-d301f594dd00-utilities\") pod \"redhat-marketplace-rt9bt\" (UID: \"073c7670-7cb5-4160-b6f2-d301f594dd00\") " pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.376900 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/073c7670-7cb5-4160-b6f2-d301f594dd00-catalog-content\") pod \"redhat-marketplace-rt9bt\" (UID: 
\"073c7670-7cb5-4160-b6f2-d301f594dd00\") " pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.377039 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25328d2d-3fa8-4a87-bc53-dc9088802bbf-secret-volume\") pod \"collect-profiles-29402655-trzcj\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.377129 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/073c7670-7cb5-4160-b6f2-d301f594dd00-utilities\") pod \"redhat-marketplace-rt9bt\" (UID: \"073c7670-7cb5-4160-b6f2-d301f594dd00\") " pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.377136 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j45hg\" (UniqueName: \"kubernetes.io/projected/25328d2d-3fa8-4a87-bc53-dc9088802bbf-kube-api-access-j45hg\") pod \"collect-profiles-29402655-trzcj\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.377450 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25328d2d-3fa8-4a87-bc53-dc9088802bbf-config-volume\") pod \"collect-profiles-29402655-trzcj\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.389098 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25328d2d-3fa8-4a87-bc53-dc9088802bbf-secret-volume\") pod 
\"collect-profiles-29402655-trzcj\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.393200 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl4xv\" (UniqueName: \"kubernetes.io/projected/073c7670-7cb5-4160-b6f2-d301f594dd00-kube-api-access-dl4xv\") pod \"redhat-marketplace-rt9bt\" (UID: \"073c7670-7cb5-4160-b6f2-d301f594dd00\") " pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.393440 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j45hg\" (UniqueName: \"kubernetes.io/projected/25328d2d-3fa8-4a87-bc53-dc9088802bbf-kube-api-access-j45hg\") pod \"collect-profiles-29402655-trzcj\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.399518 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nb8ks"] Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.401258 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.405555 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.413079 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nb8ks"] Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.424184 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14803438-3861-4da4-a23f-798966cbb5e4" path="/var/lib/kubelet/pods/14803438-3861-4da4-a23f-798966cbb5e4/volumes" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.424869 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43791ad6-52e3-4dff-8b3b-3098c3011e85" path="/var/lib/kubelet/pods/43791ad6-52e3-4dff-8b3b-3098c3011e85/volumes" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.425443 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="498090a1-7333-4c7f-b48b-ae3ea0166165" path="/var/lib/kubelet/pods/498090a1-7333-4c7f-b48b-ae3ea0166165/volumes" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.426456 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c485dff-89c6-4d40-8ba4-3b69ac68820e" path="/var/lib/kubelet/pods/6c485dff-89c6-4d40-8ba4-3b69ac68820e/volumes" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.426896 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df50fe7e-b188-43c2-bfeb-e82b35741ad1" path="/var/lib/kubelet/pods/df50fe7e-b188-43c2-bfeb-e82b35741ad1/volumes" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.438793 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.477715 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/923697b4-3ab4-4e51-8f10-501c3c2cdff6-utilities\") pod \"redhat-operators-nb8ks\" (UID: \"923697b4-3ab4-4e51-8f10-501c3c2cdff6\") " pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.477760 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk4vm\" (UniqueName: \"kubernetes.io/projected/923697b4-3ab4-4e51-8f10-501c3c2cdff6-kube-api-access-pk4vm\") pod \"redhat-operators-nb8ks\" (UID: \"923697b4-3ab4-4e51-8f10-501c3c2cdff6\") " pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.477788 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/923697b4-3ab4-4e51-8f10-501c3c2cdff6-catalog-content\") pod \"redhat-operators-nb8ks\" (UID: \"923697b4-3ab4-4e51-8f10-501c3c2cdff6\") " pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.515175 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.578477 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/923697b4-3ab4-4e51-8f10-501c3c2cdff6-utilities\") pod \"redhat-operators-nb8ks\" (UID: \"923697b4-3ab4-4e51-8f10-501c3c2cdff6\") " pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.578616 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk4vm\" (UniqueName: \"kubernetes.io/projected/923697b4-3ab4-4e51-8f10-501c3c2cdff6-kube-api-access-pk4vm\") pod \"redhat-operators-nb8ks\" (UID: \"923697b4-3ab4-4e51-8f10-501c3c2cdff6\") " pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.578639 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/923697b4-3ab4-4e51-8f10-501c3c2cdff6-catalog-content\") pod \"redhat-operators-nb8ks\" (UID: \"923697b4-3ab4-4e51-8f10-501c3c2cdff6\") " pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.579003 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/923697b4-3ab4-4e51-8f10-501c3c2cdff6-catalog-content\") pod \"redhat-operators-nb8ks\" (UID: \"923697b4-3ab4-4e51-8f10-501c3c2cdff6\") " pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.579201 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/923697b4-3ab4-4e51-8f10-501c3c2cdff6-utilities\") pod \"redhat-operators-nb8ks\" (UID: \"923697b4-3ab4-4e51-8f10-501c3c2cdff6\") " 
pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.595775 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk4vm\" (UniqueName: \"kubernetes.io/projected/923697b4-3ab4-4e51-8f10-501c3c2cdff6-kube-api-access-pk4vm\") pod \"redhat-operators-nb8ks\" (UID: \"923697b4-3ab4-4e51-8f10-501c3c2cdff6\") " pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.735197 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.843171 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj"] Nov 26 12:15:00 crc kubenswrapper[4834]: I1126 12:15:00.928058 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rt9bt"] Nov 26 12:15:00 crc kubenswrapper[4834]: W1126 12:15:00.936362 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073c7670_7cb5_4160_b6f2_d301f594dd00.slice/crio-b49da64b6d191d095174dba5efe43fd37e05fc3d8831d7a0e9ace7d8700ea59f WatchSource:0}: Error finding container b49da64b6d191d095174dba5efe43fd37e05fc3d8831d7a0e9ace7d8700ea59f: Status 404 returned error can't find the container with id b49da64b6d191d095174dba5efe43fd37e05fc3d8831d7a0e9ace7d8700ea59f Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.130102 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nb8ks"] Nov 26 12:15:01 crc kubenswrapper[4834]: W1126 12:15:01.166334 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923697b4_3ab4_4e51_8f10_501c3c2cdff6.slice/crio-29595ee30e12e4ca4b540b0aea707110375f8b7bc577cdcec8f5387ca2bdecaf WatchSource:0}: Error finding container 29595ee30e12e4ca4b540b0aea707110375f8b7bc577cdcec8f5387ca2bdecaf: Status 404 returned error can't find the container with id 29595ee30e12e4ca4b540b0aea707110375f8b7bc577cdcec8f5387ca2bdecaf Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.564277 4834 generic.go:334] "Generic (PLEG): container finished" podID="25328d2d-3fa8-4a87-bc53-dc9088802bbf" containerID="a99a0d7694cf5ce97ba92bfb1f9fe7f9ba9860a0a956a28a24fd65dd7bc355f0" exitCode=0 Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.564351 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" event={"ID":"25328d2d-3fa8-4a87-bc53-dc9088802bbf","Type":"ContainerDied","Data":"a99a0d7694cf5ce97ba92bfb1f9fe7f9ba9860a0a956a28a24fd65dd7bc355f0"} Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.564404 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" event={"ID":"25328d2d-3fa8-4a87-bc53-dc9088802bbf","Type":"ContainerStarted","Data":"d0f0cac434e84d2638cdf78208ae2d79322d85e55377b632acb650912aaacd3f"} Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.569702 4834 generic.go:334] "Generic (PLEG): container finished" podID="923697b4-3ab4-4e51-8f10-501c3c2cdff6" containerID="6909ed3760da1051e5856c80c790ad8865b2a1f9f3dd5818617ecbfb2c8c38ec" exitCode=0 Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.569759 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8ks" event={"ID":"923697b4-3ab4-4e51-8f10-501c3c2cdff6","Type":"ContainerDied","Data":"6909ed3760da1051e5856c80c790ad8865b2a1f9f3dd5818617ecbfb2c8c38ec"} Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.569821 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8ks" event={"ID":"923697b4-3ab4-4e51-8f10-501c3c2cdff6","Type":"ContainerStarted","Data":"29595ee30e12e4ca4b540b0aea707110375f8b7bc577cdcec8f5387ca2bdecaf"} Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.571552 4834 generic.go:334] "Generic (PLEG): container finished" podID="073c7670-7cb5-4160-b6f2-d301f594dd00" containerID="245571e5b6250447168e1ef249c41870fd9349a12c53888251dde7f9363d1a57" exitCode=0 Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.571699 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rt9bt" event={"ID":"073c7670-7cb5-4160-b6f2-d301f594dd00","Type":"ContainerDied","Data":"245571e5b6250447168e1ef249c41870fd9349a12c53888251dde7f9363d1a57"} Nov 26 12:15:01 crc kubenswrapper[4834]: I1126 12:15:01.571784 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rt9bt" event={"ID":"073c7670-7cb5-4160-b6f2-d301f594dd00","Type":"ContainerStarted","Data":"b49da64b6d191d095174dba5efe43fd37e05fc3d8831d7a0e9ace7d8700ea59f"} Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.606186 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-99nqk"] Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.609566 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.610859 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzwqh\" (UniqueName: \"kubernetes.io/projected/1816ee93-0f51-41b5-9763-03a43aa4b6a7-kube-api-access-rzwqh\") pod \"certified-operators-99nqk\" (UID: \"1816ee93-0f51-41b5-9763-03a43aa4b6a7\") " pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.610904 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1816ee93-0f51-41b5-9763-03a43aa4b6a7-catalog-content\") pod \"certified-operators-99nqk\" (UID: \"1816ee93-0f51-41b5-9763-03a43aa4b6a7\") " pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.610938 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1816ee93-0f51-41b5-9763-03a43aa4b6a7-utilities\") pod \"certified-operators-99nqk\" (UID: \"1816ee93-0f51-41b5-9763-03a43aa4b6a7\") " pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.612486 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.616624 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-99nqk"] Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.711601 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzwqh\" (UniqueName: \"kubernetes.io/projected/1816ee93-0f51-41b5-9763-03a43aa4b6a7-kube-api-access-rzwqh\") pod \"certified-operators-99nqk\" 
(UID: \"1816ee93-0f51-41b5-9763-03a43aa4b6a7\") " pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.711640 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1816ee93-0f51-41b5-9763-03a43aa4b6a7-catalog-content\") pod \"certified-operators-99nqk\" (UID: \"1816ee93-0f51-41b5-9763-03a43aa4b6a7\") " pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.711676 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1816ee93-0f51-41b5-9763-03a43aa4b6a7-utilities\") pod \"certified-operators-99nqk\" (UID: \"1816ee93-0f51-41b5-9763-03a43aa4b6a7\") " pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.712034 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1816ee93-0f51-41b5-9763-03a43aa4b6a7-utilities\") pod \"certified-operators-99nqk\" (UID: \"1816ee93-0f51-41b5-9763-03a43aa4b6a7\") " pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.712259 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1816ee93-0f51-41b5-9763-03a43aa4b6a7-catalog-content\") pod \"certified-operators-99nqk\" (UID: \"1816ee93-0f51-41b5-9763-03a43aa4b6a7\") " pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.729584 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzwqh\" (UniqueName: \"kubernetes.io/projected/1816ee93-0f51-41b5-9763-03a43aa4b6a7-kube-api-access-rzwqh\") pod \"certified-operators-99nqk\" (UID: \"1816ee93-0f51-41b5-9763-03a43aa4b6a7\") " 
pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.810541 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l9v6t"] Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.811672 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.815392 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.835733 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9v6t"] Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.914104 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d5b15f7-54f3-4bde-b304-4e803caf4309-utilities\") pod \"community-operators-l9v6t\" (UID: \"0d5b15f7-54f3-4bde-b304-4e803caf4309\") " pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.914157 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d5b15f7-54f3-4bde-b304-4e803caf4309-catalog-content\") pod \"community-operators-l9v6t\" (UID: \"0d5b15f7-54f3-4bde-b304-4e803caf4309\") " pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.914177 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkwsj\" (UniqueName: \"kubernetes.io/projected/0d5b15f7-54f3-4bde-b304-4e803caf4309-kube-api-access-vkwsj\") pod \"community-operators-l9v6t\" (UID: \"0d5b15f7-54f3-4bde-b304-4e803caf4309\") " 
pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.925547 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:02 crc kubenswrapper[4834]: I1126 12:15:02.953835 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.017154 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d5b15f7-54f3-4bde-b304-4e803caf4309-utilities\") pod \"community-operators-l9v6t\" (UID: \"0d5b15f7-54f3-4bde-b304-4e803caf4309\") " pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.017197 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d5b15f7-54f3-4bde-b304-4e803caf4309-catalog-content\") pod \"community-operators-l9v6t\" (UID: \"0d5b15f7-54f3-4bde-b304-4e803caf4309\") " pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.017217 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkwsj\" (UniqueName: \"kubernetes.io/projected/0d5b15f7-54f3-4bde-b304-4e803caf4309-kube-api-access-vkwsj\") pod \"community-operators-l9v6t\" (UID: \"0d5b15f7-54f3-4bde-b304-4e803caf4309\") " pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.017760 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d5b15f7-54f3-4bde-b304-4e803caf4309-catalog-content\") pod \"community-operators-l9v6t\" (UID: \"0d5b15f7-54f3-4bde-b304-4e803caf4309\") " 
pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.017999 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d5b15f7-54f3-4bde-b304-4e803caf4309-utilities\") pod \"community-operators-l9v6t\" (UID: \"0d5b15f7-54f3-4bde-b304-4e803caf4309\") " pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.034670 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkwsj\" (UniqueName: \"kubernetes.io/projected/0d5b15f7-54f3-4bde-b304-4e803caf4309-kube-api-access-vkwsj\") pod \"community-operators-l9v6t\" (UID: \"0d5b15f7-54f3-4bde-b304-4e803caf4309\") " pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.118372 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25328d2d-3fa8-4a87-bc53-dc9088802bbf-config-volume\") pod \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.118464 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25328d2d-3fa8-4a87-bc53-dc9088802bbf-secret-volume\") pod \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.118557 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j45hg\" (UniqueName: \"kubernetes.io/projected/25328d2d-3fa8-4a87-bc53-dc9088802bbf-kube-api-access-j45hg\") pod \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\" (UID: \"25328d2d-3fa8-4a87-bc53-dc9088802bbf\") " Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.119957 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25328d2d-3fa8-4a87-bc53-dc9088802bbf-config-volume" (OuterVolumeSpecName: "config-volume") pod "25328d2d-3fa8-4a87-bc53-dc9088802bbf" (UID: "25328d2d-3fa8-4a87-bc53-dc9088802bbf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.122592 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25328d2d-3fa8-4a87-bc53-dc9088802bbf-kube-api-access-j45hg" (OuterVolumeSpecName: "kube-api-access-j45hg") pod "25328d2d-3fa8-4a87-bc53-dc9088802bbf" (UID: "25328d2d-3fa8-4a87-bc53-dc9088802bbf"). InnerVolumeSpecName "kube-api-access-j45hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.122715 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25328d2d-3fa8-4a87-bc53-dc9088802bbf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "25328d2d-3fa8-4a87-bc53-dc9088802bbf" (UID: "25328d2d-3fa8-4a87-bc53-dc9088802bbf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.150415 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.220486 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j45hg\" (UniqueName: \"kubernetes.io/projected/25328d2d-3fa8-4a87-bc53-dc9088802bbf-kube-api-access-j45hg\") on node \"crc\" DevicePath \"\"" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.220530 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25328d2d-3fa8-4a87-bc53-dc9088802bbf-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.220540 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25328d2d-3fa8-4a87-bc53-dc9088802bbf-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.307295 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-99nqk"] Nov 26 12:15:03 crc kubenswrapper[4834]: W1126 12:15:03.312388 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1816ee93_0f51_41b5_9763_03a43aa4b6a7.slice/crio-a389eb2760f1ca68baafeafe9c4d0a6ab9566b4dedf0649e97a7818e73fee0cc WatchSource:0}: Error finding container a389eb2760f1ca68baafeafe9c4d0a6ab9566b4dedf0649e97a7818e73fee0cc: Status 404 returned error can't find the container with id a389eb2760f1ca68baafeafe9c4d0a6ab9566b4dedf0649e97a7818e73fee0cc Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.499909 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9v6t"] Nov 26 12:15:03 crc kubenswrapper[4834]: W1126 12:15:03.507429 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d5b15f7_54f3_4bde_b304_4e803caf4309.slice/crio-0275c559272970c8e9568b76cf4eb1a58b3ca305bcacd9269b6b22231092654c WatchSource:0}: Error finding container 0275c559272970c8e9568b76cf4eb1a58b3ca305bcacd9269b6b22231092654c: Status 404 returned error can't find the container with id 0275c559272970c8e9568b76cf4eb1a58b3ca305bcacd9269b6b22231092654c Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.585105 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" event={"ID":"25328d2d-3fa8-4a87-bc53-dc9088802bbf","Type":"ContainerDied","Data":"d0f0cac434e84d2638cdf78208ae2d79322d85e55377b632acb650912aaacd3f"} Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.585148 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.585247 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0f0cac434e84d2638cdf78208ae2d79322d85e55377b632acb650912aaacd3f" Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.586554 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9v6t" event={"ID":"0d5b15f7-54f3-4bde-b304-4e803caf4309","Type":"ContainerStarted","Data":"0275c559272970c8e9568b76cf4eb1a58b3ca305bcacd9269b6b22231092654c"} Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.589422 4834 generic.go:334] "Generic (PLEG): container finished" podID="1816ee93-0f51-41b5-9763-03a43aa4b6a7" containerID="a3c8a1f25887959c9c4cecca6ce19ffedcf80ac1934c9b1bc5788f6b6d12793d" exitCode=0 Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.589524 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99nqk" 
event={"ID":"1816ee93-0f51-41b5-9763-03a43aa4b6a7","Type":"ContainerDied","Data":"a3c8a1f25887959c9c4cecca6ce19ffedcf80ac1934c9b1bc5788f6b6d12793d"} Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.589559 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99nqk" event={"ID":"1816ee93-0f51-41b5-9763-03a43aa4b6a7","Type":"ContainerStarted","Data":"a389eb2760f1ca68baafeafe9c4d0a6ab9566b4dedf0649e97a7818e73fee0cc"} Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.592333 4834 generic.go:334] "Generic (PLEG): container finished" podID="923697b4-3ab4-4e51-8f10-501c3c2cdff6" containerID="22d6f39180505ef4b34bc380ca504b2896ab147b31b734e2e3856f53894a522a" exitCode=0 Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.592409 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8ks" event={"ID":"923697b4-3ab4-4e51-8f10-501c3c2cdff6","Type":"ContainerDied","Data":"22d6f39180505ef4b34bc380ca504b2896ab147b31b734e2e3856f53894a522a"} Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.594033 4834 generic.go:334] "Generic (PLEG): container finished" podID="073c7670-7cb5-4160-b6f2-d301f594dd00" containerID="c624e5d5cffefbdee8c278818e4affd07c06c1e87bed6a090dc6831e7f30426b" exitCode=0 Nov 26 12:15:03 crc kubenswrapper[4834]: I1126 12:15:03.594065 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rt9bt" event={"ID":"073c7670-7cb5-4160-b6f2-d301f594dd00","Type":"ContainerDied","Data":"c624e5d5cffefbdee8c278818e4affd07c06c1e87bed6a090dc6831e7f30426b"} Nov 26 12:15:04 crc kubenswrapper[4834]: I1126 12:15:04.603745 4834 generic.go:334] "Generic (PLEG): container finished" podID="0d5b15f7-54f3-4bde-b304-4e803caf4309" containerID="6330a88a0d93bb10e8ecc754ee1981742c95ee213764e5b52b3c9947722bffce" exitCode=0 Nov 26 12:15:04 crc kubenswrapper[4834]: I1126 12:15:04.603779 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-l9v6t" event={"ID":"0d5b15f7-54f3-4bde-b304-4e803caf4309","Type":"ContainerDied","Data":"6330a88a0d93bb10e8ecc754ee1981742c95ee213764e5b52b3c9947722bffce"} Nov 26 12:15:04 crc kubenswrapper[4834]: I1126 12:15:04.611723 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8ks" event={"ID":"923697b4-3ab4-4e51-8f10-501c3c2cdff6","Type":"ContainerStarted","Data":"fb0bb24290f05862c84380c6663000209bbb224d2b6320d8cd457c5e08f264dd"} Nov 26 12:15:04 crc kubenswrapper[4834]: I1126 12:15:04.624005 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rt9bt" event={"ID":"073c7670-7cb5-4160-b6f2-d301f594dd00","Type":"ContainerStarted","Data":"c98872a803a5978f5491d18774dfe216bcb9a1e5b26029db42422df9cecaa778"} Nov 26 12:15:04 crc kubenswrapper[4834]: I1126 12:15:04.638384 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nb8ks" podStartSLOduration=1.9918220340000001 podStartE2EDuration="4.638367742s" podCreationTimestamp="2025-11-26 12:15:00 +0000 UTC" firstStartedPulling="2025-11-26 12:15:01.573861335 +0000 UTC m=+199.481074687" lastFinishedPulling="2025-11-26 12:15:04.220407042 +0000 UTC m=+202.127620395" observedRunningTime="2025-11-26 12:15:04.638138271 +0000 UTC m=+202.545351623" watchObservedRunningTime="2025-11-26 12:15:04.638367742 +0000 UTC m=+202.545581094" Nov 26 12:15:04 crc kubenswrapper[4834]: I1126 12:15:04.652306 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rt9bt" podStartSLOduration=2.035274093 podStartE2EDuration="4.65229276s" podCreationTimestamp="2025-11-26 12:15:00 +0000 UTC" firstStartedPulling="2025-11-26 12:15:01.574174105 +0000 UTC m=+199.481387457" lastFinishedPulling="2025-11-26 12:15:04.191192772 +0000 UTC m=+202.098406124" observedRunningTime="2025-11-26 12:15:04.650758235 
+0000 UTC m=+202.557971587" watchObservedRunningTime="2025-11-26 12:15:04.65229276 +0000 UTC m=+202.559506112" Nov 26 12:15:05 crc kubenswrapper[4834]: I1126 12:15:05.631097 4834 generic.go:334] "Generic (PLEG): container finished" podID="1816ee93-0f51-41b5-9763-03a43aa4b6a7" containerID="0d2464fb1418922dbcefbab02f939418d2a76b8ef9b37b9f562830be53df90b7" exitCode=0 Nov 26 12:15:05 crc kubenswrapper[4834]: I1126 12:15:05.631164 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99nqk" event={"ID":"1816ee93-0f51-41b5-9763-03a43aa4b6a7","Type":"ContainerDied","Data":"0d2464fb1418922dbcefbab02f939418d2a76b8ef9b37b9f562830be53df90b7"} Nov 26 12:15:06 crc kubenswrapper[4834]: I1126 12:15:06.638241 4834 generic.go:334] "Generic (PLEG): container finished" podID="0d5b15f7-54f3-4bde-b304-4e803caf4309" containerID="9a2588964196a1c4f54657c4f6f02d131346fd275eff1d9610eb02cd786b2f08" exitCode=0 Nov 26 12:15:06 crc kubenswrapper[4834]: I1126 12:15:06.638281 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9v6t" event={"ID":"0d5b15f7-54f3-4bde-b304-4e803caf4309","Type":"ContainerDied","Data":"9a2588964196a1c4f54657c4f6f02d131346fd275eff1d9610eb02cd786b2f08"} Nov 26 12:15:06 crc kubenswrapper[4834]: I1126 12:15:06.641577 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99nqk" event={"ID":"1816ee93-0f51-41b5-9763-03a43aa4b6a7","Type":"ContainerStarted","Data":"82823d34657818ef0023ca56a9e8051b7c212021e89425ddaf281ecbeab09844"} Nov 26 12:15:06 crc kubenswrapper[4834]: I1126 12:15:06.676249 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-99nqk" podStartSLOduration=1.816133696 podStartE2EDuration="4.676228594s" podCreationTimestamp="2025-11-26 12:15:02 +0000 UTC" firstStartedPulling="2025-11-26 12:15:03.59793869 +0000 UTC m=+201.505152043" 
lastFinishedPulling="2025-11-26 12:15:06.458033589 +0000 UTC m=+204.365246941" observedRunningTime="2025-11-26 12:15:06.673358456 +0000 UTC m=+204.580571808" watchObservedRunningTime="2025-11-26 12:15:06.676228594 +0000 UTC m=+204.583441946" Nov 26 12:15:07 crc kubenswrapper[4834]: I1126 12:15:07.650276 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9v6t" event={"ID":"0d5b15f7-54f3-4bde-b304-4e803caf4309","Type":"ContainerStarted","Data":"36e7916ccc34f6cd3a9462a282fd334f46edc0053723bf51a67cb5821bcb4e59"} Nov 26 12:15:07 crc kubenswrapper[4834]: I1126 12:15:07.666743 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l9v6t" podStartSLOduration=3.055970317 podStartE2EDuration="5.666729338s" podCreationTimestamp="2025-11-26 12:15:02 +0000 UTC" firstStartedPulling="2025-11-26 12:15:04.605389411 +0000 UTC m=+202.512602764" lastFinishedPulling="2025-11-26 12:15:07.216148433 +0000 UTC m=+205.123361785" observedRunningTime="2025-11-26 12:15:07.664266817 +0000 UTC m=+205.571480169" watchObservedRunningTime="2025-11-26 12:15:07.666729338 +0000 UTC m=+205.573942690" Nov 26 12:15:10 crc kubenswrapper[4834]: I1126 12:15:10.515505 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:10 crc kubenswrapper[4834]: I1126 12:15:10.515885 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:10 crc kubenswrapper[4834]: I1126 12:15:10.557724 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:10 crc kubenswrapper[4834]: I1126 12:15:10.694790 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rt9bt" Nov 26 12:15:10 crc kubenswrapper[4834]: 
I1126 12:15:10.736110 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:10 crc kubenswrapper[4834]: I1126 12:15:10.736143 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:10 crc kubenswrapper[4834]: I1126 12:15:10.765410 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:11 crc kubenswrapper[4834]: I1126 12:15:11.702201 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nb8ks" Nov 26 12:15:12 crc kubenswrapper[4834]: I1126 12:15:12.955379 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:12 crc kubenswrapper[4834]: I1126 12:15:12.955454 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:12 crc kubenswrapper[4834]: I1126 12:15:12.989370 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:13 crc kubenswrapper[4834]: I1126 12:15:13.151284 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:13 crc kubenswrapper[4834]: I1126 12:15:13.151366 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:13 crc kubenswrapper[4834]: I1126 12:15:13.187213 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:13 crc kubenswrapper[4834]: I1126 12:15:13.713361 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-99nqk" Nov 26 12:15:13 crc kubenswrapper[4834]: I1126 12:15:13.713688 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l9v6t" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.197339 4834 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.197827 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25328d2d-3fa8-4a87-bc53-dc9088802bbf" containerName="collect-profiles" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.197840 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="25328d2d-3fa8-4a87-bc53-dc9088802bbf" containerName="collect-profiles" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.197927 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="25328d2d-3fa8-4a87-bc53-dc9088802bbf" containerName="collect-profiles" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.198525 4834 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.198708 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.198959 4834 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199059 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8" gracePeriod=15 Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199145 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0" gracePeriod=15 Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.199211 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199161 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d" gracePeriod=15 Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199163 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8" gracePeriod=15 Nov 26 12:15:17 crc 
kubenswrapper[4834]: I1126 12:15:17.198924 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72" gracePeriod=15 Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199225 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.199403 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199427 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.199448 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199455 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.199463 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199468 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.199481 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 12:15:17 
crc kubenswrapper[4834]: I1126 12:15:17.199488 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.199496 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199501 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199707 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199719 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199729 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199738 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199745 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199753 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.199847 4834 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.199855 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.201912 4834 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.211466 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.211501 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.211552 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.211578 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.211599 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.241627 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313144 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313226 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313304 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313428 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313511 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313613 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313635 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313706 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313725 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313762 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313813 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.313948 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.314055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.392267 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.393234 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.393717 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.394080 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.394549 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.394595 4834 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.394884 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 192.168.26.148:6443: connect: connection refused" interval="200ms" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.414950 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.415024 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.415061 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.415145 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.415187 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.415208 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.537659 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:15:17 crc kubenswrapper[4834]: W1126 12:15:17.561081 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5b3706ac5b02b7496a17f3d990f936f5352ed1a9d833122081a4e01920aeff86 WatchSource:0}: Error finding container 5b3706ac5b02b7496a17f3d990f936f5352ed1a9d833122081a4e01920aeff86: Status 404 returned error can't find the container with id 5b3706ac5b02b7496a17f3d990f936f5352ed1a9d833122081a4e01920aeff86 Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.563766 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.148:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b8d8d6a8d099e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 12:15:17.563214238 +0000 UTC 
m=+215.470427590,LastTimestamp:2025-11-26 12:15:17.563214238 +0000 UTC m=+215.470427590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.595765 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" interval="400ms" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.708453 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.709686 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.710342 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72" exitCode=0 Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.710374 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0" exitCode=0 Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.710382 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8" exitCode=0 Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.710390 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d" exitCode=2 Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.710466 4834 scope.go:117] "RemoveContainer" containerID="4e9e77237325869febcf524a7d29dd575d35f24025c8ec28fcc6755c33d83d26" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.713167 4834 generic.go:334] "Generic (PLEG): container finished" podID="54a38ab2-63cf-4044-8d03-1d84672f0f10" containerID="8af481aaa6b3abfa36657e06b426191a105468de9f5fae17a7487febcd2cd933" exitCode=0 Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.713224 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"54a38ab2-63cf-4044-8d03-1d84672f0f10","Type":"ContainerDied","Data":"8af481aaa6b3abfa36657e06b426191a105468de9f5fae17a7487febcd2cd933"} Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.713935 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.714115 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: I1126 12:15:17.715387 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5b3706ac5b02b7496a17f3d990f936f5352ed1a9d833122081a4e01920aeff86"} Nov 26 12:15:17 crc 
kubenswrapper[4834]: E1126 12:15:17.963955 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:15:17Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:15:17Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:15:17Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T12:15:17Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.964542 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.965091 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.965331 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node 
\"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.965618 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.965663 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 12:15:17 crc kubenswrapper[4834]: E1126 12:15:17.996920 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" interval="800ms" Nov 26 12:15:18 crc kubenswrapper[4834]: I1126 12:15:18.723381 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"89a6730dd1fedb33e1544044b281e7e9cb541ba590722dd732e6472ad7be0ab4"} Nov 26 12:15:18 crc kubenswrapper[4834]: I1126 12:15:18.724947 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:18 crc kubenswrapper[4834]: I1126 12:15:18.725246 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:18 crc kubenswrapper[4834]: I1126 12:15:18.727025 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 12:15:18 crc kubenswrapper[4834]: E1126 12:15:18.797876 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" interval="1.6s" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.028388 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.028905 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.029412 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.131557 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54a38ab2-63cf-4044-8d03-1d84672f0f10-kube-api-access\") pod 
\"54a38ab2-63cf-4044-8d03-1d84672f0f10\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.131625 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-var-lock\") pod \"54a38ab2-63cf-4044-8d03-1d84672f0f10\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.131704 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-kubelet-dir\") pod \"54a38ab2-63cf-4044-8d03-1d84672f0f10\" (UID: \"54a38ab2-63cf-4044-8d03-1d84672f0f10\") " Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.131989 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "54a38ab2-63cf-4044-8d03-1d84672f0f10" (UID: "54a38ab2-63cf-4044-8d03-1d84672f0f10"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.132124 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-var-lock" (OuterVolumeSpecName: "var-lock") pod "54a38ab2-63cf-4044-8d03-1d84672f0f10" (UID: "54a38ab2-63cf-4044-8d03-1d84672f0f10"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.135800 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a38ab2-63cf-4044-8d03-1d84672f0f10-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "54a38ab2-63cf-4044-8d03-1d84672f0f10" (UID: "54a38ab2-63cf-4044-8d03-1d84672f0f10"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.233422 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54a38ab2-63cf-4044-8d03-1d84672f0f10-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.233463 4834 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-var-lock\") on node \"crc\" DevicePath \"\"" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.233473 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a38ab2-63cf-4044-8d03-1d84672f0f10-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.533452 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.545078 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.545635 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.546449 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.546694 4834 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.639046 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.639190 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.639221 
4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.639184 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.639291 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.639438 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.639569 4834 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.639586 4834 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.639595 4834 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.736836 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.737473 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8" exitCode=0 Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.737583 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.737588 4834 scope.go:117] "RemoveContainer" containerID="261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.739530 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"54a38ab2-63cf-4044-8d03-1d84672f0f10","Type":"ContainerDied","Data":"c45a4c0546bc485d1546dadd65ca14019bfc60712a6b64f322923c140ff9ec3b"} Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.740202 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c45a4c0546bc485d1546dadd65ca14019bfc60712a6b64f322923c140ff9ec3b" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.739576 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.752645 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.752850 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.753068 4834 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.753301 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.753545 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.753748 4834 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.754883 4834 scope.go:117] "RemoveContainer" containerID="cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.766608 4834 scope.go:117] "RemoveContainer" containerID="01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.778154 4834 scope.go:117] "RemoveContainer" containerID="41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.788556 4834 scope.go:117] 
"RemoveContainer" containerID="d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.800922 4834 scope.go:117] "RemoveContainer" containerID="60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.815855 4834 scope.go:117] "RemoveContainer" containerID="261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72" Nov 26 12:15:19 crc kubenswrapper[4834]: E1126 12:15:19.816171 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\": container with ID starting with 261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72 not found: ID does not exist" containerID="261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.816203 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72"} err="failed to get container status \"261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\": rpc error: code = NotFound desc = could not find container \"261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72\": container with ID starting with 261213c96e02ad0f71ba8a2b98ee96277ecded5eab7426424d1318ee06dcfb72 not found: ID does not exist" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.816227 4834 scope.go:117] "RemoveContainer" containerID="cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0" Nov 26 12:15:19 crc kubenswrapper[4834]: E1126 12:15:19.816549 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\": container with ID starting with 
cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0 not found: ID does not exist" containerID="cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.816611 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0"} err="failed to get container status \"cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\": rpc error: code = NotFound desc = could not find container \"cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0\": container with ID starting with cdd879427c063e04fe3ba74df927f04af7da7f929b4104bb8582018a62077af0 not found: ID does not exist" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.816656 4834 scope.go:117] "RemoveContainer" containerID="01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8" Nov 26 12:15:19 crc kubenswrapper[4834]: E1126 12:15:19.816972 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\": container with ID starting with 01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8 not found: ID does not exist" containerID="01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.817008 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8"} err="failed to get container status \"01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\": rpc error: code = NotFound desc = could not find container \"01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8\": container with ID starting with 01c210cefcac3e6ae01c3e0c2f20a1df9f2067aea50ec4ef72891967f90950e8 not found: ID does not 
exist" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.817027 4834 scope.go:117] "RemoveContainer" containerID="41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d" Nov 26 12:15:19 crc kubenswrapper[4834]: E1126 12:15:19.817271 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\": container with ID starting with 41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d not found: ID does not exist" containerID="41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.817306 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d"} err="failed to get container status \"41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\": rpc error: code = NotFound desc = could not find container \"41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d\": container with ID starting with 41309e2cc8b7122a89ec2d62d14cb50894fbd903d09523a2efb5f227a078ec6d not found: ID does not exist" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.817346 4834 scope.go:117] "RemoveContainer" containerID="d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8" Nov 26 12:15:19 crc kubenswrapper[4834]: E1126 12:15:19.817614 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\": container with ID starting with d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8 not found: ID does not exist" containerID="d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.817660 4834 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8"} err="failed to get container status \"d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\": rpc error: code = NotFound desc = could not find container \"d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8\": container with ID starting with d7c5f4634da746a2db3bdc45bf8e47aecf1e2703505d2cd361a7f4525807f8d8 not found: ID does not exist" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.817676 4834 scope.go:117] "RemoveContainer" containerID="60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0" Nov 26 12:15:19 crc kubenswrapper[4834]: E1126 12:15:19.817932 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\": container with ID starting with 60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0 not found: ID does not exist" containerID="60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0" Nov 26 12:15:19 crc kubenswrapper[4834]: I1126 12:15:19.817970 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0"} err="failed to get container status \"60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\": rpc error: code = NotFound desc = could not find container \"60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0\": container with ID starting with 60e509254b1c8ccd2abface621e71ad1525e94776c8c6c00314dc28998b997f0 not found: ID does not exist" Nov 26 12:15:20 crc kubenswrapper[4834]: E1126 12:15:20.398998 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
192.168.26.148:6443: connect: connection refused" interval="3.2s" Nov 26 12:15:20 crc kubenswrapper[4834]: I1126 12:15:20.423814 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 26 12:15:21 crc kubenswrapper[4834]: E1126 12:15:21.283609 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.148:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b8d8d6a8d099e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 12:15:17.563214238 +0000 UTC m=+215.470427590,LastTimestamp:2025-11-26 12:15:17.563214238 +0000 UTC m=+215.470427590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 12:15:21 crc kubenswrapper[4834]: I1126 12:15:21.530975 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:15:21 crc kubenswrapper[4834]: I1126 12:15:21.531024 4834 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:15:21 crc kubenswrapper[4834]: I1126 12:15:21.531070 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:15:21 crc kubenswrapper[4834]: I1126 12:15:21.531554 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:15:21 crc kubenswrapper[4834]: I1126 12:15:21.531612 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef" gracePeriod=600 Nov 26 12:15:21 crc kubenswrapper[4834]: I1126 12:15:21.751559 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef" exitCode=0 Nov 26 12:15:21 crc kubenswrapper[4834]: I1126 12:15:21.751658 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef"} Nov 26 12:15:22 crc kubenswrapper[4834]: I1126 12:15:22.419101 4834 status_manager.go:851] "Failed 
to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:22 crc kubenswrapper[4834]: I1126 12:15:22.419418 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:22 crc kubenswrapper[4834]: I1126 12:15:22.762197 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"f165beb62e23145bceea6aef6fcd553d3b72544fd470871ee30eb868544a4707"} Nov 26 12:15:22 crc kubenswrapper[4834]: I1126 12:15:22.762811 4834 status_manager.go:851] "Failed to get status for pod" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-xzb52\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:22 crc kubenswrapper[4834]: I1126 12:15:22.763100 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:22 crc kubenswrapper[4834]: I1126 12:15:22.763403 4834 status_manager.go:851] "Failed to get 
status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:23 crc kubenswrapper[4834]: E1126 12:15:23.600183 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" interval="6.4s" Nov 26 12:15:26 crc kubenswrapper[4834]: E1126 12:15:26.518835 4834 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 192.168.26.148:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" volumeName="registry-storage" Nov 26 12:15:30 crc kubenswrapper[4834]: E1126 12:15:30.000623 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.148:6443: connect: connection refused" interval="7s" Nov 26 12:15:31 crc kubenswrapper[4834]: E1126 12:15:31.284860 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.148:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b8d8d6a8d099e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 12:15:17.563214238 +0000 UTC m=+215.470427590,LastTimestamp:2025-11-26 12:15:17.563214238 +0000 UTC m=+215.470427590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 12:15:31 crc kubenswrapper[4834]: I1126 12:15:31.804432 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 26 12:15:31 crc kubenswrapper[4834]: I1126 12:15:31.804481 4834 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e" exitCode=1 Nov 26 12:15:31 crc kubenswrapper[4834]: I1126 12:15:31.804511 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e"} Nov 26 12:15:31 crc kubenswrapper[4834]: I1126 12:15:31.804832 4834 scope.go:117] "RemoveContainer" containerID="6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e" Nov 26 12:15:31 crc kubenswrapper[4834]: I1126 12:15:31.805239 4834 status_manager.go:851] "Failed to get status for pod" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" 
pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-xzb52\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:31 crc kubenswrapper[4834]: I1126 12:15:31.806371 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:31 crc kubenswrapper[4834]: I1126 12:15:31.806665 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:31 crc kubenswrapper[4834]: I1126 12:15:31.806951 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.416862 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.419178 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.419796 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.420104 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.420565 4834 status_manager.go:851] "Failed to get status for pod" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-xzb52\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.421269 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.421669 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.421937 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.422241 4834 status_manager.go:851] "Failed to get status for pod" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-xzb52\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.431519 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.431546 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:32 crc kubenswrapper[4834]: E1126 12:15:32.431907 4834 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.432449 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:32 crc kubenswrapper[4834]: W1126 12:15:32.452200 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-920da43c95a27aede68c8fcb5d2e0f07394281f70075081560f65a92a6f2ac5e WatchSource:0}: Error finding container 920da43c95a27aede68c8fcb5d2e0f07394281f70075081560f65a92a6f2ac5e: Status 404 returned error can't find the container with id 920da43c95a27aede68c8fcb5d2e0f07394281f70075081560f65a92a6f2ac5e Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.812440 4834 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ea5943738245acdd33d8a5de566ce8294ca949baa1ba7f1bf82fa4577990d2f4" exitCode=0 Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.812523 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ea5943738245acdd33d8a5de566ce8294ca949baa1ba7f1bf82fa4577990d2f4"} Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.812574 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"920da43c95a27aede68c8fcb5d2e0f07394281f70075081560f65a92a6f2ac5e"} Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.812797 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.812822 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:32 crc kubenswrapper[4834]: E1126 12:15:32.813120 4834 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.813145 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.813472 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.813705 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.814067 4834 status_manager.go:851] "Failed to get status for pod" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" 
pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-xzb52\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.816142 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.816200 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a50de1beed62e5ac6da917e036827b5e2ef597f6ca91d13a0ddec39c7ec702bf"} Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.817406 4834 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.817648 4834 status_manager.go:851] "Failed to get status for pod" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.817943 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:32 crc kubenswrapper[4834]: I1126 12:15:32.818230 4834 status_manager.go:851] "Failed to get status for pod" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-xzb52\": dial tcp 192.168.26.148:6443: connect: connection refused" Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.826966 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"94fb5ab1c756ea09bfa637460a417c33681c94a08ab10401668890e1209bf8cb"} Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.827438 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9aef5bbfca50bded7e95db58d4f5f6fe30de19a8bf098e7f4d451d8f70460200"} Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.827451 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"262b508475df13a6ea9d27829d27d6fb7f415deb51674427ada40d72f0261a2e"} Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.827461 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"10dc1e96a95b7a0a71f084a3e09c0f3799091692d25718e2b6a8402879d3fae2"} Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.827471 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"47c25fcaacb560234603e87deeb624618281e79b9dfe876960f2483ade0b17ee"} Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.827748 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.827860 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.827878 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.847554 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.847859 4834 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 26 12:15:33 crc kubenswrapper[4834]: I1126 12:15:33.847935 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 26 12:15:37 crc kubenswrapper[4834]: I1126 12:15:37.433531 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:37 crc 
kubenswrapper[4834]: I1126 12:15:37.433934 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:37 crc kubenswrapper[4834]: I1126 12:15:37.438008 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:38 crc kubenswrapper[4834]: I1126 12:15:38.772379 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:15:39 crc kubenswrapper[4834]: I1126 12:15:39.101397 4834 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:39 crc kubenswrapper[4834]: I1126 12:15:39.851511 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:39 crc kubenswrapper[4834]: I1126 12:15:39.851540 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:39 crc kubenswrapper[4834]: I1126 12:15:39.854413 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:15:40 crc kubenswrapper[4834]: I1126 12:15:40.855889 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:40 crc kubenswrapper[4834]: I1126 12:15:40.856161 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:15:42 crc kubenswrapper[4834]: I1126 12:15:42.428456 4834 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
podUID="13e1b811-2231-47ef-9d99-bf9d29400c33" Nov 26 12:15:43 crc kubenswrapper[4834]: I1126 12:15:43.848123 4834 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 26 12:15:43 crc kubenswrapper[4834]: I1126 12:15:43.848199 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 26 12:15:48 crc kubenswrapper[4834]: I1126 12:15:48.320727 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 26 12:15:49 crc kubenswrapper[4834]: I1126 12:15:49.306921 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 26 12:15:49 crc kubenswrapper[4834]: I1126 12:15:49.435183 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 26 12:15:49 crc kubenswrapper[4834]: I1126 12:15:49.448604 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 26 12:15:49 crc kubenswrapper[4834]: I1126 12:15:49.620737 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 26 12:15:49 crc kubenswrapper[4834]: I1126 12:15:49.860821 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 26 12:15:49 crc 
kubenswrapper[4834]: I1126 12:15:49.876563 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 26 12:15:50 crc kubenswrapper[4834]: I1126 12:15:50.136585 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 26 12:15:50 crc kubenswrapper[4834]: I1126 12:15:50.164656 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 26 12:15:50 crc kubenswrapper[4834]: I1126 12:15:50.287628 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 26 12:15:50 crc kubenswrapper[4834]: I1126 12:15:50.591420 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 26 12:15:50 crc kubenswrapper[4834]: I1126 12:15:50.747010 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 26 12:15:50 crc kubenswrapper[4834]: I1126 12:15:50.812396 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 26 12:15:50 crc kubenswrapper[4834]: I1126 12:15:50.919120 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 26 12:15:50 crc kubenswrapper[4834]: I1126 12:15:50.942441 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 26 12:15:50 crc kubenswrapper[4834]: I1126 12:15:50.994200 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.047081 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 26 
12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.069558 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.122838 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.372351 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.386790 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.525703 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.554575 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.555512 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.571600 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.628743 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.743082 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.752391 4834 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.759331 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.877780 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.890592 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 26 12:15:51 crc kubenswrapper[4834]: I1126 12:15:51.899840 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.255947 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.344502 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.407238 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.445920 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.516753 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.518786 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 
12:15:52.559501 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.617646 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.655210 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.693777 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.700372 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.706856 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.723811 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.749647 4834 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.813224 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 26 12:15:52 crc kubenswrapper[4834]: I1126 12:15:52.982151 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.006775 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.046878 
4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.137910 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.210463 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.289556 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.290629 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.340337 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.342252 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.352099 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.367150 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.384800 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.415027 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.431762 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.495886 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.521765 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.587994 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.601898 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.606634 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.667645 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.806295 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.847806 4834 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" 
start-of-body= Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.847876 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.847929 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.848515 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"a50de1beed62e5ac6da917e036827b5e2ef597f6ca91d13a0ddec39c7ec702bf"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.848615 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://a50de1beed62e5ac6da917e036827b5e2ef597f6ca91d13a0ddec39c7ec702bf" gracePeriod=30 Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.904288 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 26 12:15:53 crc kubenswrapper[4834]: I1126 12:15:53.913291 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.057395 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.086203 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.410456 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.442882 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.453513 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.479026 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.507568 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.544124 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.554466 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.677096 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.832175 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 26 12:15:54 crc kubenswrapper[4834]: I1126 12:15:54.901674 4834 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.259203 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.307529 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.369843 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.452116 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.458000 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.474197 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.585868 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.741629 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.780989 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.929563 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 26 12:15:55 crc 
kubenswrapper[4834]: I1126 12:15:55.955817 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.968714 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 26 12:15:55 crc kubenswrapper[4834]: I1126 12:15:55.975769 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.026788 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.197537 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.227578 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.276958 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.323880 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.393644 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.405974 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.430946 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.499446 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.557002 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.586108 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.592984 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.593171 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.664168 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.706573 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.765067 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.766645 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.771474 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 26 
12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.801586 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.882543 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 26 12:15:56 crc kubenswrapper[4834]: I1126 12:15:56.958459 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.085940 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.123150 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.137303 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.167663 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.171693 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.253283 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.269079 4834 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.271401 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.272233 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.322233 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.322979 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.388778 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.397891 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.412857 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.457543 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.543007 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.564091 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.602039 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.704186 4834 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.771656 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.771883 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.784027 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.853525 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.941015 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 26 12:15:57 crc kubenswrapper[4834]: I1126 12:15:57.963018 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.006700 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.069022 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.250626 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.300550 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 26 12:15:58 crc kubenswrapper[4834]: 
I1126 12:15:58.380715 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.584846 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.681259 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.700370 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.730388 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.785167 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.834210 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.853176 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.862116 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.888518 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 26 12:15:58 crc kubenswrapper[4834]: I1126 12:15:58.889087 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 26 12:15:58 crc 
kubenswrapper[4834]: I1126 12:15:58.979918 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.119859 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.123376 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.178580 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.254586 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.395618 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.411189 4834 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.430421 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.452447 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.501454 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.740094 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.754073 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.822911 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 26 12:15:59 crc kubenswrapper[4834]: I1126 12:15:59.922055 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.010533 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.030976 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.043907 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.237604 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.244654 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.282586 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.314509 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 26 12:16:00 
crc kubenswrapper[4834]: I1126 12:16:00.380966 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.488587 4834 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.520037 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.584399 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.599699 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.665225 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.690377 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.696334 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.841643 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.848263 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.962283 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 12:16:00 crc kubenswrapper[4834]: I1126 12:16:00.991071 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.135451 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.144881 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.145897 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.179692 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.236590 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.317456 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.571622 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.692386 4834 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.694927 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
podStartSLOduration=44.694905362 podStartE2EDuration="44.694905362s" podCreationTimestamp="2025-11-26 12:15:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:15:39.156268711 +0000 UTC m=+237.063482053" watchObservedRunningTime="2025-11-26 12:16:01.694905362 +0000 UTC m=+259.602118713" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.696345 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.696399 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.696731 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.696762 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc3e686b-b5f9-4829-b768-68d16850643e" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.699658 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.709115 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.709102782 podStartE2EDuration="22.709102782s" podCreationTimestamp="2025-11-26 12:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:16:01.707985411 +0000 UTC m=+259.615198763" watchObservedRunningTime="2025-11-26 12:16:01.709102782 +0000 UTC m=+259.616316134" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.737090 4834 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.794103 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.823499 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.826387 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.877551 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 26 12:16:01 crc kubenswrapper[4834]: I1126 12:16:01.915272 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.066532 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.118834 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.129778 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.230900 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.306142 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 
12:16:02.349474 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.398204 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.434729 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.453645 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.474203 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.502393 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.530868 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.549903 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.575747 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.587225 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.605336 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.718043 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.741715 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.787791 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 12:16:02 crc kubenswrapper[4834]: I1126 12:16:02.928352 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 26 12:16:03 crc kubenswrapper[4834]: I1126 12:16:03.009295 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 26 12:16:03 crc kubenswrapper[4834]: I1126 12:16:03.244508 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 26 12:16:03 crc kubenswrapper[4834]: I1126 12:16:03.250705 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 26 12:16:03 crc kubenswrapper[4834]: I1126 12:16:03.358660 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 26 12:16:03 crc kubenswrapper[4834]: I1126 12:16:03.489966 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 26 12:16:03 crc kubenswrapper[4834]: I1126 12:16:03.649508 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 26 12:16:03 crc kubenswrapper[4834]: I1126 12:16:03.657014 4834 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 26 12:16:03 crc kubenswrapper[4834]: I1126 12:16:03.707782 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 26 12:16:03 crc kubenswrapper[4834]: I1126 12:16:03.969441 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 26 12:16:04 crc kubenswrapper[4834]: I1126 12:16:04.158048 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 26 12:16:04 crc kubenswrapper[4834]: I1126 12:16:04.184551 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 26 12:16:04 crc kubenswrapper[4834]: I1126 12:16:04.207788 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 26 12:16:04 crc kubenswrapper[4834]: I1126 12:16:04.312205 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 26 12:16:04 crc kubenswrapper[4834]: I1126 12:16:04.329462 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 26 12:16:04 crc kubenswrapper[4834]: I1126 12:16:04.482910 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 26 12:16:04 crc kubenswrapper[4834]: I1126 12:16:04.859180 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 26 12:16:05 crc kubenswrapper[4834]: I1126 12:16:05.082283 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" 
Nov 26 12:16:05 crc kubenswrapper[4834]: I1126 12:16:05.480065 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 26 12:16:06 crc kubenswrapper[4834]: I1126 12:16:06.032892 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 26 12:16:06 crc kubenswrapper[4834]: I1126 12:16:06.348354 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 26 12:16:07 crc kubenswrapper[4834]: I1126 12:16:07.206997 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 26 12:16:11 crc kubenswrapper[4834]: I1126 12:16:11.820558 4834 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 12:16:11 crc kubenswrapper[4834]: I1126 12:16:11.821001 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://89a6730dd1fedb33e1544044b281e7e9cb541ba590722dd732e6472ad7be0ab4" gracePeriod=5 Nov 26 12:16:14 crc kubenswrapper[4834]: I1126 12:16:14.938066 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.026235 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.026724 4834 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="89a6730dd1fedb33e1544044b281e7e9cb541ba590722dd732e6472ad7be0ab4" exitCode=137 Nov 26 12:16:17 crc 
kubenswrapper[4834]: I1126 12:16:17.380958 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.381035 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.581571 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.581800 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.581821 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.581858 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.581654 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" 
(OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.581932 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.581888 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.582000 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.581995 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.582136 4834 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.582146 4834 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.582154 4834 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.582162 4834 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.587612 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:16:17 crc kubenswrapper[4834]: I1126 12:16:17.683471 4834 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 12:16:18 crc kubenswrapper[4834]: I1126 12:16:18.032430 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 26 12:16:18 crc kubenswrapper[4834]: I1126 12:16:18.032517 4834 scope.go:117] "RemoveContainer" containerID="89a6730dd1fedb33e1544044b281e7e9cb541ba590722dd732e6472ad7be0ab4" Nov 26 12:16:18 crc kubenswrapper[4834]: I1126 12:16:18.032573 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 12:16:18 crc kubenswrapper[4834]: I1126 12:16:18.421896 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 26 12:16:18 crc kubenswrapper[4834]: I1126 12:16:18.422129 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Nov 26 12:16:18 crc kubenswrapper[4834]: I1126 12:16:18.433933 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 12:16:18 crc kubenswrapper[4834]: I1126 12:16:18.433968 4834 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="318be63a-29a0-4ae9-8c88-9df32b0eefb0" Nov 26 12:16:18 crc kubenswrapper[4834]: I1126 12:16:18.436284 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 12:16:18 crc kubenswrapper[4834]: I1126 12:16:18.436305 4834 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="318be63a-29a0-4ae9-8c88-9df32b0eefb0" Nov 26 12:16:24 crc kubenswrapper[4834]: I1126 12:16:24.061531 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Nov 26 12:16:24 crc kubenswrapper[4834]: I1126 12:16:24.063822 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 26 12:16:24 crc kubenswrapper[4834]: I1126 12:16:24.063868 4834 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a50de1beed62e5ac6da917e036827b5e2ef597f6ca91d13a0ddec39c7ec702bf" exitCode=137 Nov 26 12:16:24 crc kubenswrapper[4834]: I1126 12:16:24.063899 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a50de1beed62e5ac6da917e036827b5e2ef597f6ca91d13a0ddec39c7ec702bf"} Nov 26 12:16:24 crc kubenswrapper[4834]: I1126 12:16:24.063932 4834 scope.go:117] "RemoveContainer" containerID="6fa965756e87e2bc9e0bd69f6b62c87f23a2e21b457abe97a5468d0af30e190e" Nov 26 12:16:25 crc kubenswrapper[4834]: I1126 12:16:25.070295 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Nov 26 12:16:25 crc kubenswrapper[4834]: I1126 12:16:25.071476 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4abca46c79f074e5e404d1f3c3db910afc505a6f69b5ddc559fe2c3da387020a"}
Nov 26 12:16:28 crc kubenswrapper[4834]: I1126 12:16:28.771848 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 12:16:28 crc kubenswrapper[4834]: I1126 12:16:28.853576 4834 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Nov 26 12:16:32 crc kubenswrapper[4834]: I1126 12:16:32.619333 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Nov 26 12:16:33 crc kubenswrapper[4834]: I1126 12:16:33.781343 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Nov 26 12:16:33 crc kubenswrapper[4834]: I1126 12:16:33.848128 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 12:16:33 crc kubenswrapper[4834]: I1126 12:16:33.851053 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 12:16:34 crc kubenswrapper[4834]: I1126 12:16:34.114805 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 12:16:35 crc kubenswrapper[4834]: I1126 12:16:35.116413 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Nov 26 12:16:36 crc kubenswrapper[4834]: I1126 12:16:36.202469 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Nov 26 12:16:41 crc kubenswrapper[4834]: I1126 12:16:41.522617 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.369417 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kkflb"]
Nov 26 12:16:42 crc kubenswrapper[4834]: E1126 12:16:42.369895 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" containerName="installer"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.369914 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" containerName="installer"
Nov 26 12:16:42 crc kubenswrapper[4834]: E1126 12:16:42.369929 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.369935 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.370016 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a38ab2-63cf-4044-8d03-1d84672f0f10" containerName="installer"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.370035 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.370504 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.385455 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kkflb"]
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.460019 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.554067 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.554256 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-registry-tls\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.554322 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.554352 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4nc\" (UniqueName: \"kubernetes.io/projected/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-kube-api-access-hn4nc\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.554418 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-trusted-ca\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.554474 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.554511 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-bound-sa-token\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.554545 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-registry-certificates\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.595892 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.655805 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-registry-tls\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.655866 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.655885 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4nc\" (UniqueName: \"kubernetes.io/projected/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-kube-api-access-hn4nc\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.655919 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-trusted-ca\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.655968 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-bound-sa-token\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.655994 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-registry-certificates\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.656038 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.657591 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.658231 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-registry-certificates\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.658971 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-trusted-ca\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.663180 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.668654 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-registry-tls\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.676931 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-bound-sa-token\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.677736 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4nc\" (UniqueName: \"kubernetes.io/projected/bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b-kube-api-access-hn4nc\") pod \"image-registry-66df7c8f76-kkflb\" (UID: \"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:42 crc kubenswrapper[4834]: I1126 12:16:42.684636 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:43 crc kubenswrapper[4834]: I1126 12:16:43.067985 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kkflb"]
Nov 26 12:16:43 crc kubenswrapper[4834]: I1126 12:16:43.154448 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kkflb" event={"ID":"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b","Type":"ContainerStarted","Data":"edc954f6a00eefdec279b479de195d6eb5485e3894f639dc098610d7c4be4c50"}
Nov 26 12:16:44 crc kubenswrapper[4834]: I1126 12:16:44.161194 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kkflb" event={"ID":"bf44af4b-b5ac-4a28-aa0c-6c351c75bb4b","Type":"ContainerStarted","Data":"e555adb73552eddebcb4989e7e4291869004f57b2a5eacb9047363155449712f"}
Nov 26 12:16:44 crc kubenswrapper[4834]: I1126 12:16:44.161678 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:16:44 crc kubenswrapper[4834]: I1126 12:16:44.176554 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-kkflb" podStartSLOduration=2.176524379 podStartE2EDuration="2.176524379s" podCreationTimestamp="2025-11-26 12:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:16:44.17601153 +0000 UTC m=+302.083224882" watchObservedRunningTime="2025-11-26 12:16:44.176524379 +0000 UTC m=+302.083737730"
Nov 26 12:17:02 crc kubenswrapper[4834]: I1126 12:17:02.688959 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-kkflb"
Nov 26 12:17:02 crc kubenswrapper[4834]: I1126 12:17:02.731654 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9cvdp"]
Nov 26 12:17:21 crc kubenswrapper[4834]: I1126 12:17:21.531385 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 12:17:21 crc kubenswrapper[4834]: I1126 12:17:21.531882 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 12:17:27 crc kubenswrapper[4834]: I1126 12:17:27.762217 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" podUID="c5f5d03b-e71e-44d5-96c2-2dd188cd3712" containerName="registry" containerID="cri-o://f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117" gracePeriod=30
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.058787 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp"
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.194466 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-bound-sa-token\") pod \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") "
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.194520 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jxsz\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-kube-api-access-7jxsz\") pod \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") "
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.194578 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-certificates\") pod \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") "
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.194604 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-installation-pull-secrets\") pod \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") "
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.194670 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-tls\") pod \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") "
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.194725 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-trusted-ca\") pod \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") "
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.194843 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-ca-trust-extracted\") pod \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") "
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.195063 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\" (UID: \"c5f5d03b-e71e-44d5-96c2-2dd188cd3712\") "
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.196033 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c5f5d03b-e71e-44d5-96c2-2dd188cd3712" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.196073 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c5f5d03b-e71e-44d5-96c2-2dd188cd3712" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.202402 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c5f5d03b-e71e-44d5-96c2-2dd188cd3712" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.203053 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c5f5d03b-e71e-44d5-96c2-2dd188cd3712" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.204715 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c5f5d03b-e71e-44d5-96c2-2dd188cd3712" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.205706 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-kube-api-access-7jxsz" (OuterVolumeSpecName: "kube-api-access-7jxsz") pod "c5f5d03b-e71e-44d5-96c2-2dd188cd3712" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712"). InnerVolumeSpecName "kube-api-access-7jxsz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.209237 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c5f5d03b-e71e-44d5-96c2-2dd188cd3712" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.211377 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c5f5d03b-e71e-44d5-96c2-2dd188cd3712" (UID: "c5f5d03b-e71e-44d5-96c2-2dd188cd3712"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.297675 4834 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-certificates\") on node \"crc\" DevicePath \"\""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.297710 4834 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.297722 4834 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-registry-tls\") on node \"crc\" DevicePath \"\""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.297732 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.297744 4834 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.297753 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-bound-sa-token\") on node \"crc\" DevicePath \"\""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.297762 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jxsz\" (UniqueName: \"kubernetes.io/projected/c5f5d03b-e71e-44d5-96c2-2dd188cd3712-kube-api-access-7jxsz\") on node \"crc\" DevicePath \"\""
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.396827 4834 generic.go:334] "Generic (PLEG): container finished" podID="c5f5d03b-e71e-44d5-96c2-2dd188cd3712" containerID="f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117" exitCode=0
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.396872 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" event={"ID":"c5f5d03b-e71e-44d5-96c2-2dd188cd3712","Type":"ContainerDied","Data":"f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117"}
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.396909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp" event={"ID":"c5f5d03b-e71e-44d5-96c2-2dd188cd3712","Type":"ContainerDied","Data":"cea4aff14949b69f6f79d71764e303255564f2c5511256ee62943b5bfceb4641"}
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.396929 4834 scope.go:117] "RemoveContainer" containerID="f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117"
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.397050 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9cvdp"
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.414629 4834 scope.go:117] "RemoveContainer" containerID="f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117"
Nov 26 12:17:28 crc kubenswrapper[4834]: E1126 12:17:28.415146 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117\": container with ID starting with f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117 not found: ID does not exist" containerID="f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117"
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.415192 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117"} err="failed to get container status \"f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117\": rpc error: code = NotFound desc = could not find container \"f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117\": container with ID starting with f1c052670b03d36115de3cd28f66145aabd2712ac4cb5e8ad98aa0cbbd5a3117 not found: ID does not exist"
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.426848 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9cvdp"]
Nov 26 12:17:28 crc kubenswrapper[4834]: I1126 12:17:28.426885 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9cvdp"]
Nov 26 12:17:30 crc kubenswrapper[4834]: I1126 12:17:30.423161 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f5d03b-e71e-44d5-96c2-2dd188cd3712" path="/var/lib/kubelet/pods/c5f5d03b-e71e-44d5-96c2-2dd188cd3712/volumes"
Nov 26 12:17:51 crc kubenswrapper[4834]: I1126 12:17:51.531830 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 12:17:51 crc kubenswrapper[4834]: I1126 12:17:51.532557 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 12:18:21 crc kubenswrapper[4834]: I1126 12:18:21.531388 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 12:18:21 crc kubenswrapper[4834]: I1126 12:18:21.532059 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 12:18:21 crc kubenswrapper[4834]: I1126 12:18:21.532111 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52"
Nov 26 12:18:21 crc kubenswrapper[4834]: I1126 12:18:21.532733 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f165beb62e23145bceea6aef6fcd553d3b72544fd470871ee30eb868544a4707"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 26 12:18:21 crc kubenswrapper[4834]: I1126 12:18:21.532792 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://f165beb62e23145bceea6aef6fcd553d3b72544fd470871ee30eb868544a4707" gracePeriod=600
Nov 26 12:18:21 crc kubenswrapper[4834]: I1126 12:18:21.678182 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="f165beb62e23145bceea6aef6fcd553d3b72544fd470871ee30eb868544a4707" exitCode=0
Nov 26 12:18:21 crc kubenswrapper[4834]: I1126 12:18:21.678282 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"f165beb62e23145bceea6aef6fcd553d3b72544fd470871ee30eb868544a4707"}
Nov 26 12:18:21 crc kubenswrapper[4834]: I1126 12:18:21.678537 4834 scope.go:117] "RemoveContainer" containerID="850dc2e685b745be4326f5dcd67f2b05092d3a063b76779cd951a830559aa6ef"
Nov 26 12:18:22 crc kubenswrapper[4834]: I1126 12:18:22.686701 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"758813397b9c2c2ccad432b9cb11a0273141f48fa83e1aaad525d33be881d337"}
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.121251 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-j8kwl"]
Nov 26 12:20:10 crc kubenswrapper[4834]: E1126 12:20:10.121938 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f5d03b-e71e-44d5-96c2-2dd188cd3712" containerName="registry"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.121950 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f5d03b-e71e-44d5-96c2-2dd188cd3712" containerName="registry"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.122069 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f5d03b-e71e-44d5-96c2-2dd188cd3712" containerName="registry"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.122433 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-j8kwl"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.123713 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-6cl2q"]
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.123801 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jc5xn"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.124233 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-6cl2q"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.127098 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ql4fg"]
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.127676 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.134502 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.134708 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.136969 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2zn2z"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.137353 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-xg7qv"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.140844 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-j8kwl"]
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.148645 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ql4fg"]
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.152008 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-6cl2q"]
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.152353 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffnql\" (UniqueName: \"kubernetes.io/projected/d1a808a5-cb4c-4edf-b738-cc825cc68a1a-kube-api-access-ffnql\") pod \"cert-manager-cainjector-7f985d654d-j8kwl\" (UID: \"d1a808a5-cb4c-4edf-b738-cc825cc68a1a\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-j8kwl"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.152411 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkgfc\" (UniqueName: \"kubernetes.io/projected/a2c348a1-3019-4a84-bd2a-416c7be748ea-kube-api-access-fkgfc\") pod \"cert-manager-webhook-5655c58dd6-ql4fg\" (UID: \"a2c348a1-3019-4a84-bd2a-416c7be748ea\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.152559 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmpx4\" (UniqueName: \"kubernetes.io/projected/576cb695-a9de-4c83-a7a5-0727a9b6899d-kube-api-access-bmpx4\") pod \"cert-manager-5b446d88c5-6cl2q\" (UID: \"576cb695-a9de-4c83-a7a5-0727a9b6899d\") " pod="cert-manager/cert-manager-5b446d88c5-6cl2q"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.253753 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffnql\" (UniqueName: \"kubernetes.io/projected/d1a808a5-cb4c-4edf-b738-cc825cc68a1a-kube-api-access-ffnql\") pod \"cert-manager-cainjector-7f985d654d-j8kwl\" (UID: \"d1a808a5-cb4c-4edf-b738-cc825cc68a1a\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-j8kwl"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.253819 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkgfc\" (UniqueName: \"kubernetes.io/projected/a2c348a1-3019-4a84-bd2a-416c7be748ea-kube-api-access-fkgfc\") pod \"cert-manager-webhook-5655c58dd6-ql4fg\" (UID: \"a2c348a1-3019-4a84-bd2a-416c7be748ea\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.253879 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmpx4\" (UniqueName: \"kubernetes.io/projected/576cb695-a9de-4c83-a7a5-0727a9b6899d-kube-api-access-bmpx4\") pod \"cert-manager-5b446d88c5-6cl2q\" (UID: \"576cb695-a9de-4c83-a7a5-0727a9b6899d\") " pod="cert-manager/cert-manager-5b446d88c5-6cl2q"
Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.268919 4834
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkgfc\" (UniqueName: \"kubernetes.io/projected/a2c348a1-3019-4a84-bd2a-416c7be748ea-kube-api-access-fkgfc\") pod \"cert-manager-webhook-5655c58dd6-ql4fg\" (UID: \"a2c348a1-3019-4a84-bd2a-416c7be748ea\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg" Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.269020 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmpx4\" (UniqueName: \"kubernetes.io/projected/576cb695-a9de-4c83-a7a5-0727a9b6899d-kube-api-access-bmpx4\") pod \"cert-manager-5b446d88c5-6cl2q\" (UID: \"576cb695-a9de-4c83-a7a5-0727a9b6899d\") " pod="cert-manager/cert-manager-5b446d88c5-6cl2q" Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.269492 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffnql\" (UniqueName: \"kubernetes.io/projected/d1a808a5-cb4c-4edf-b738-cc825cc68a1a-kube-api-access-ffnql\") pod \"cert-manager-cainjector-7f985d654d-j8kwl\" (UID: \"d1a808a5-cb4c-4edf-b738-cc825cc68a1a\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-j8kwl" Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.451748 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-j8kwl" Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.458884 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-6cl2q" Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.464650 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg" Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.816843 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ql4fg"] Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.827145 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.843634 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-j8kwl"] Nov 26 12:20:10 crc kubenswrapper[4834]: W1126 12:20:10.845710 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a808a5_cb4c_4edf_b738_cc825cc68a1a.slice/crio-5c7742681e7db29c5b3dd321a518f2baae76dcb483a3eef59c19e4a95fba09b9 WatchSource:0}: Error finding container 5c7742681e7db29c5b3dd321a518f2baae76dcb483a3eef59c19e4a95fba09b9: Status 404 returned error can't find the container with id 5c7742681e7db29c5b3dd321a518f2baae76dcb483a3eef59c19e4a95fba09b9 Nov 26 12:20:10 crc kubenswrapper[4834]: I1126 12:20:10.846521 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-6cl2q"] Nov 26 12:20:10 crc kubenswrapper[4834]: W1126 12:20:10.850332 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod576cb695_a9de_4c83_a7a5_0727a9b6899d.slice/crio-3a014c787f85a3b303678d39ff0db32ac9353f49d7050ebe4a1bdc5427f33c4a WatchSource:0}: Error finding container 3a014c787f85a3b303678d39ff0db32ac9353f49d7050ebe4a1bdc5427f33c4a: Status 404 returned error can't find the container with id 3a014c787f85a3b303678d39ff0db32ac9353f49d7050ebe4a1bdc5427f33c4a Nov 26 12:20:11 crc kubenswrapper[4834]: I1126 12:20:11.183453 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-5b446d88c5-6cl2q" event={"ID":"576cb695-a9de-4c83-a7a5-0727a9b6899d","Type":"ContainerStarted","Data":"3a014c787f85a3b303678d39ff0db32ac9353f49d7050ebe4a1bdc5427f33c4a"} Nov 26 12:20:11 crc kubenswrapper[4834]: I1126 12:20:11.184357 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg" event={"ID":"a2c348a1-3019-4a84-bd2a-416c7be748ea","Type":"ContainerStarted","Data":"e3a27acdc0330db988b7d6b96ee81ddf9a33ae48076f8dee299d27fae7a9e69e"} Nov 26 12:20:11 crc kubenswrapper[4834]: I1126 12:20:11.185109 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-j8kwl" event={"ID":"d1a808a5-cb4c-4edf-b738-cc825cc68a1a","Type":"ContainerStarted","Data":"5c7742681e7db29c5b3dd321a518f2baae76dcb483a3eef59c19e4a95fba09b9"} Nov 26 12:20:14 crc kubenswrapper[4834]: I1126 12:20:14.200284 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-6cl2q" event={"ID":"576cb695-a9de-4c83-a7a5-0727a9b6899d","Type":"ContainerStarted","Data":"e907215fa3fe5a69779de256f6491edf677a527684a58e66f315609c22d8450a"} Nov 26 12:20:14 crc kubenswrapper[4834]: I1126 12:20:14.201848 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg" event={"ID":"a2c348a1-3019-4a84-bd2a-416c7be748ea","Type":"ContainerStarted","Data":"4b6afa89655e748e5278efb43283449658bf40a5b9fa1ab0b997b5f47272ffb7"} Nov 26 12:20:14 crc kubenswrapper[4834]: I1126 12:20:14.202012 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg" Nov 26 12:20:14 crc kubenswrapper[4834]: I1126 12:20:14.202951 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-j8kwl" 
event={"ID":"d1a808a5-cb4c-4edf-b738-cc825cc68a1a","Type":"ContainerStarted","Data":"fd8ac3d85d6447c01ad47762cb827f4c8d9f18596958a0106bfc7fac45684505"} Nov 26 12:20:14 crc kubenswrapper[4834]: I1126 12:20:14.212892 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-6cl2q" podStartSLOduration=1.538667512 podStartE2EDuration="4.212881962s" podCreationTimestamp="2025-11-26 12:20:10 +0000 UTC" firstStartedPulling="2025-11-26 12:20:10.851951106 +0000 UTC m=+508.759164459" lastFinishedPulling="2025-11-26 12:20:13.526165557 +0000 UTC m=+511.433378909" observedRunningTime="2025-11-26 12:20:14.211119706 +0000 UTC m=+512.118333058" watchObservedRunningTime="2025-11-26 12:20:14.212881962 +0000 UTC m=+512.120095314" Nov 26 12:20:14 crc kubenswrapper[4834]: I1126 12:20:14.239576 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-j8kwl" podStartSLOduration=1.556173753 podStartE2EDuration="4.239558938s" podCreationTimestamp="2025-11-26 12:20:10 +0000 UTC" firstStartedPulling="2025-11-26 12:20:10.847892148 +0000 UTC m=+508.755105499" lastFinishedPulling="2025-11-26 12:20:13.531277332 +0000 UTC m=+511.438490684" observedRunningTime="2025-11-26 12:20:14.237106319 +0000 UTC m=+512.144319672" watchObservedRunningTime="2025-11-26 12:20:14.239558938 +0000 UTC m=+512.146772289" Nov 26 12:20:14 crc kubenswrapper[4834]: I1126 12:20:14.239836 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg" podStartSLOduration=1.535663074 podStartE2EDuration="4.23983025s" podCreationTimestamp="2025-11-26 12:20:10 +0000 UTC" firstStartedPulling="2025-11-26 12:20:10.826282975 +0000 UTC m=+508.733496326" lastFinishedPulling="2025-11-26 12:20:13.530450151 +0000 UTC m=+511.437663502" observedRunningTime="2025-11-26 12:20:14.226584186 +0000 UTC m=+512.133797538" 
watchObservedRunningTime="2025-11-26 12:20:14.23983025 +0000 UTC m=+512.147043592" Nov 26 12:20:20 crc kubenswrapper[4834]: I1126 12:20:20.467097 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-ql4fg" Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.531925 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.532384 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.701294 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9dvt4"] Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.701657 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovn-controller" containerID="cri-o://a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443" gracePeriod=30 Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.701723 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="nbdb" containerID="cri-o://44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c" gracePeriod=30 Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.701814 4834 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="sbdb" containerID="cri-o://913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589" gracePeriod=30 Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.701809 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49" gracePeriod=30 Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.701878 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="northd" containerID="cri-o://7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087" gracePeriod=30 Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.701926 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kube-rbac-proxy-node" containerID="cri-o://fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a" gracePeriod=30 Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.701989 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovn-acl-logging" containerID="cri-o://8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b" gracePeriod=30 Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.731836 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" 
podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" containerID="cri-o://a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4" gracePeriod=30 Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.968352 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/3.log" Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.971510 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovn-acl-logging/0.log" Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.972060 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovn-controller/0.log" Nov 26 12:20:21 crc kubenswrapper[4834]: I1126 12:20:21.972878 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020203 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-62tqg"] Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020771 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kubecfg-setup" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020790 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kubecfg-setup" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020800 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020807 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" 
containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020814 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020820 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020828 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="nbdb" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020833 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="nbdb" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020840 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020845 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020852 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="northd" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020858 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="northd" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020865 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020871 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" 
containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020882 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="sbdb" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020888 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="sbdb" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020896 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovn-acl-logging" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020902 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovn-acl-logging" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020908 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kube-rbac-proxy-node" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020913 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kube-rbac-proxy-node" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.020919 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovn-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.020924 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovn-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021002 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021011 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" 
containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021016 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="northd" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021025 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kube-rbac-proxy-node" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021031 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021037 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021043 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="sbdb" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021050 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovn-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021057 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovn-acl-logging" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021062 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="nbdb" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.021133 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021140 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.021148 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021154 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021222 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.021380 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerName="ovnkube-controller" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.022538 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.081981 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-script-lib\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082023 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-log-socket\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082050 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-slash\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082075 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-systemd\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082092 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-systemd-units\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082110 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082147 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-slash" (OuterVolumeSpecName: "host-slash") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082178 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-node-log\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082234 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-ovn\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082202 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-node-log" (OuterVolumeSpecName: "node-log") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082286 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-env-overrides\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082356 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-etc-openvswitch\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082381 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-var-lib-openvswitch\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082404 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-netns\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082423 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plkpv\" (UniqueName: \"kubernetes.io/projected/e7f44620-97b4-4cdb-8252-d8a2971830fa-kube-api-access-plkpv\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082295 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-log-socket" (OuterVolumeSpecName: "log-socket") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082220 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082235 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082302 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082487 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082430 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082508 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082531 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082470 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-netd\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082623 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082701 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082738 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-config\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.082798 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-ovn-kubernetes\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083067 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083121 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083170 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovn-node-metrics-cert\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083192 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-kubelet\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083214 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-bin\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083240 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-openvswitch\") pod \"e7f44620-97b4-4cdb-8252-d8a2971830fa\" (UID: \"e7f44620-97b4-4cdb-8252-d8a2971830fa\") " Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083486 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-var-lib-openvswitch\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083516 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-cni-bin\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083538 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-slash\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083559 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-etc-openvswitch\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083573 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083589 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-kubelet\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083614 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-env-overrides\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083626 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083635 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-systemd-units\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083666 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-run-ovn\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083671 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083686 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-ovn-node-metrics-cert\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083748 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-run-systemd\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083813 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-cni-netd\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083887 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-run-netns\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083932 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flpbw\" (UniqueName: \"kubernetes.io/projected/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-kube-api-access-flpbw\") pod \"ovnkube-node-62tqg\" (UID: 
\"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.083979 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-run-openvswitch\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084007 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-ovnkube-script-lib\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084031 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-node-log\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084053 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-run-ovn-kubernetes\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084076 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-log-socket\") pod 
\"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084097 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-ovnkube-config\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084173 4834 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-node-log\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084192 4834 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084203 4834 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084213 4834 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-etc-openvswitch\") on 
node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084223 4834 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084231 4834 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084239 4834 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084247 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084256 4834 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084265 4834 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084273 4834 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084282 4834 
reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084290 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084297 4834 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-log-socket\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084305 4834 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-slash\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084333 4834 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.084343 4834 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.088427 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.088866 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f44620-97b4-4cdb-8252-d8a2971830fa-kube-api-access-plkpv" (OuterVolumeSpecName: "kube-api-access-plkpv") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "kube-api-access-plkpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.094787 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e7f44620-97b4-4cdb-8252-d8a2971830fa" (UID: "e7f44620-97b4-4cdb-8252-d8a2971830fa"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185718 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-run-ovn\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185777 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-ovn-node-metrics-cert\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185806 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-run-systemd\") pod \"ovnkube-node-62tqg\" (UID: 
\"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185828 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-cni-netd\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185856 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-run-netns\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185876 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flpbw\" (UniqueName: \"kubernetes.io/projected/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-kube-api-access-flpbw\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185909 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-run-openvswitch\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185926 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-ovnkube-script-lib\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185946 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-node-log\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185963 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-run-ovn-kubernetes\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185980 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-log-socket\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.185997 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186013 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-ovnkube-config\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186032 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-cni-bin\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186048 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-var-lib-openvswitch\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186065 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-slash\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186085 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-etc-openvswitch\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186111 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-kubelet\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc 
kubenswrapper[4834]: I1126 12:20:22.186126 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-env-overrides\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186147 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-systemd-units\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186181 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plkpv\" (UniqueName: \"kubernetes.io/projected/e7f44620-97b4-4cdb-8252-d8a2971830fa-kube-api-access-plkpv\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186193 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7f44620-97b4-4cdb-8252-d8a2971830fa-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186204 4834 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7f44620-97b4-4cdb-8252-d8a2971830fa-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186252 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-systemd-units\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 
12:20:22.186291 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-node-log\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186328 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-run-ovn-kubernetes\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186348 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-log-socket\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186369 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186784 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-ovnkube-script-lib\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186843 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-run-ovn\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186913 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-ovnkube-config\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186946 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-cni-bin\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186971 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-var-lib-openvswitch\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.186992 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-slash\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.187013 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-etc-openvswitch\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.187031 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-kubelet\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.187176 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-run-netns\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.187232 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-run-systemd\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.187260 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-host-cni-netd\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.187346 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-env-overrides\") pod \"ovnkube-node-62tqg\" (UID: 
\"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.187382 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-run-openvswitch\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.189924 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-ovn-node-metrics-cert\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.200976 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flpbw\" (UniqueName: \"kubernetes.io/projected/5c8e9e0c-7ab0-4622-9e0d-012a27d59108-kube-api-access-flpbw\") pod \"ovnkube-node-62tqg\" (UID: \"5c8e9e0c-7ab0-4622-9e0d-012a27d59108\") " pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.237654 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovnkube-controller/3.log" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.239645 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovn-acl-logging/0.log" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240026 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9dvt4_e7f44620-97b4-4cdb-8252-d8a2971830fa/ovn-controller/0.log" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 
12:20:22.240368 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4" exitCode=0 Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240391 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589" exitCode=0 Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240398 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c" exitCode=0 Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240406 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087" exitCode=0 Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240411 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49" exitCode=0 Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240417 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a" exitCode=0 Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240422 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b" exitCode=143 Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240428 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7f44620-97b4-4cdb-8252-d8a2971830fa" containerID="a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443" 
exitCode=143 Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240454 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240467 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240516 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240538 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240547 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240558 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240566 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" 
event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240576 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240586 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240591 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240598 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240602 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240608 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240613 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240618 4834 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240624 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240630 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240638 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240644 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240649 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240654 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240660 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} Nov 26 12:20:22 crc kubenswrapper[4834]: 
I1126 12:20:22.240665 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240670 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240675 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240680 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240684 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240690 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240697 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240702 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240707 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240711 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240715 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240714 4834 scope.go:117] "RemoveContainer" containerID="a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240720 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240805 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240817 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240822 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240827 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240844 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9dvt4" event={"ID":"e7f44620-97b4-4cdb-8252-d8a2971830fa","Type":"ContainerDied","Data":"c47fc0dcaffee3586df9083cc357f917edd800c7f47f55a69389b47b37ffef17"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240863 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240869 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240875 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240880 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240885 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240890 4834 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240897 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240902 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240906 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.240911 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.241734 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k8hjt_234b786b-76dd-4238-81bd-a743042bece9/kube-multus/1.log" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.242076 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k8hjt_234b786b-76dd-4238-81bd-a743042bece9/kube-multus/0.log" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.242111 4834 generic.go:334] "Generic (PLEG): container finished" podID="234b786b-76dd-4238-81bd-a743042bece9" containerID="516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f" exitCode=2 Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.242138 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-k8hjt" event={"ID":"234b786b-76dd-4238-81bd-a743042bece9","Type":"ContainerDied","Data":"516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.242156 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea"} Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.242484 4834 scope.go:117] "RemoveContainer" containerID="516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.242692 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-k8hjt_openshift-multus(234b786b-76dd-4238-81bd-a743042bece9)\"" pod="openshift-multus/multus-k8hjt" podUID="234b786b-76dd-4238-81bd-a743042bece9" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.254202 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.273302 4834 scope.go:117] "RemoveContainer" containerID="913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.288836 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9dvt4"] Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.294682 4834 scope.go:117] "RemoveContainer" containerID="44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.294785 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9dvt4"] Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.303648 4834 scope.go:117] "RemoveContainer" 
containerID="7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.312084 4834 scope.go:117] "RemoveContainer" containerID="5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.319914 4834 scope.go:117] "RemoveContainer" containerID="fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.329339 4834 scope.go:117] "RemoveContainer" containerID="8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.334489 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.337339 4834 scope.go:117] "RemoveContainer" containerID="a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.346048 4834 scope.go:117] "RemoveContainer" containerID="feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.355890 4834 scope.go:117] "RemoveContainer" containerID="a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.356216 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": container with ID starting with a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4 not found: ID does not exist" containerID="a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.356254 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} err="failed to get container status \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": rpc error: code = NotFound desc = could not find container \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": container with ID starting with a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.356282 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.356663 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\": container with ID starting with 049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755 not found: ID does not exist" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.356695 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} err="failed to get container status \"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\": rpc error: code = NotFound desc = could not find container \"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\": container with ID starting with 049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.356725 4834 scope.go:117] "RemoveContainer" containerID="913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.356979 4834 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\": container with ID starting with 913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589 not found: ID does not exist" containerID="913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.357008 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} err="failed to get container status \"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\": rpc error: code = NotFound desc = could not find container \"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\": container with ID starting with 913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.357025 4834 scope.go:117] "RemoveContainer" containerID="44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.357233 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\": container with ID starting with 44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c not found: ID does not exist" containerID="44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.357268 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} err="failed to get container status \"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\": rpc error: code = NotFound desc = could not find container 
\"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\": container with ID starting with 44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.357293 4834 scope.go:117] "RemoveContainer" containerID="7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.357521 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\": container with ID starting with 7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087 not found: ID does not exist" containerID="7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.357549 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} err="failed to get container status \"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\": rpc error: code = NotFound desc = could not find container \"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\": container with ID starting with 7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.357565 4834 scope.go:117] "RemoveContainer" containerID="5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.357762 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\": container with ID starting with 5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49 not found: ID does not exist" 
containerID="5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.357784 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} err="failed to get container status \"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\": rpc error: code = NotFound desc = could not find container \"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\": container with ID starting with 5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.357796 4834 scope.go:117] "RemoveContainer" containerID="fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.357968 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\": container with ID starting with fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a not found: ID does not exist" containerID="fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.358022 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} err="failed to get container status \"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\": rpc error: code = NotFound desc = could not find container \"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\": container with ID starting with fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.358034 4834 scope.go:117] 
"RemoveContainer" containerID="8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.358216 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\": container with ID starting with 8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b not found: ID does not exist" containerID="8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.358248 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} err="failed to get container status \"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\": rpc error: code = NotFound desc = could not find container \"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\": container with ID starting with 8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.358304 4834 scope.go:117] "RemoveContainer" containerID="a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.358492 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\": container with ID starting with a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443 not found: ID does not exist" containerID="a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.358508 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} err="failed to get container status \"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\": rpc error: code = NotFound desc = could not find container \"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\": container with ID starting with a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.358538 4834 scope.go:117] "RemoveContainer" containerID="feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859" Nov 26 12:20:22 crc kubenswrapper[4834]: E1126 12:20:22.358816 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\": container with ID starting with feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859 not found: ID does not exist" containerID="feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.358830 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859"} err="failed to get container status \"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\": rpc error: code = NotFound desc = could not find container \"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\": container with ID starting with feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.358843 4834 scope.go:117] "RemoveContainer" containerID="a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.359042 4834 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} err="failed to get container status \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": rpc error: code = NotFound desc = could not find container \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": container with ID starting with a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.359057 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.359607 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} err="failed to get container status \"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\": rpc error: code = NotFound desc = could not find container \"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\": container with ID starting with 049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.359631 4834 scope.go:117] "RemoveContainer" containerID="913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.359857 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} err="failed to get container status \"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\": rpc error: code = NotFound desc = could not find container \"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\": container with ID starting with 913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589 not 
found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.359897 4834 scope.go:117] "RemoveContainer" containerID="44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.360173 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} err="failed to get container status \"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\": rpc error: code = NotFound desc = could not find container \"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\": container with ID starting with 44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.360194 4834 scope.go:117] "RemoveContainer" containerID="7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.360402 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} err="failed to get container status \"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\": rpc error: code = NotFound desc = could not find container \"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\": container with ID starting with 7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.360452 4834 scope.go:117] "RemoveContainer" containerID="5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.360681 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} err="failed to get 
container status \"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\": rpc error: code = NotFound desc = could not find container \"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\": container with ID starting with 5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.360704 4834 scope.go:117] "RemoveContainer" containerID="fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.360929 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} err="failed to get container status \"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\": rpc error: code = NotFound desc = could not find container \"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\": container with ID starting with fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.360950 4834 scope.go:117] "RemoveContainer" containerID="8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.361412 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} err="failed to get container status \"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\": rpc error: code = NotFound desc = could not find container \"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\": container with ID starting with 8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.361456 4834 scope.go:117] "RemoveContainer" 
containerID="a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.362861 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} err="failed to get container status \"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\": rpc error: code = NotFound desc = could not find container \"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\": container with ID starting with a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.362900 4834 scope.go:117] "RemoveContainer" containerID="feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.363118 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859"} err="failed to get container status \"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\": rpc error: code = NotFound desc = could not find container \"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\": container with ID starting with feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.363143 4834 scope.go:117] "RemoveContainer" containerID="a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.363387 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} err="failed to get container status \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": rpc error: code = NotFound desc = could 
not find container \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": container with ID starting with a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.363452 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.363665 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} err="failed to get container status \"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\": rpc error: code = NotFound desc = could not find container \"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\": container with ID starting with 049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.363687 4834 scope.go:117] "RemoveContainer" containerID="913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.363870 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} err="failed to get container status \"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\": rpc error: code = NotFound desc = could not find container \"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\": container with ID starting with 913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.363892 4834 scope.go:117] "RemoveContainer" containerID="44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 
12:20:22.364096 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} err="failed to get container status \"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\": rpc error: code = NotFound desc = could not find container \"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\": container with ID starting with 44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.364118 4834 scope.go:117] "RemoveContainer" containerID="7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.364330 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} err="failed to get container status \"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\": rpc error: code = NotFound desc = could not find container \"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\": container with ID starting with 7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.364351 4834 scope.go:117] "RemoveContainer" containerID="5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.364543 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} err="failed to get container status \"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\": rpc error: code = NotFound desc = could not find container \"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\": container with ID starting with 
5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.364564 4834 scope.go:117] "RemoveContainer" containerID="fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.364787 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} err="failed to get container status \"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\": rpc error: code = NotFound desc = could not find container \"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\": container with ID starting with fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.364814 4834 scope.go:117] "RemoveContainer" containerID="8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.364984 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} err="failed to get container status \"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\": rpc error: code = NotFound desc = could not find container \"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\": container with ID starting with 8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.365004 4834 scope.go:117] "RemoveContainer" containerID="a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.365220 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} err="failed to get container status \"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\": rpc error: code = NotFound desc = could not find container \"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\": container with ID starting with a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.365242 4834 scope.go:117] "RemoveContainer" containerID="feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.365451 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859"} err="failed to get container status \"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\": rpc error: code = NotFound desc = could not find container \"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\": container with ID starting with feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.365479 4834 scope.go:117] "RemoveContainer" containerID="a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.365978 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} err="failed to get container status \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": rpc error: code = NotFound desc = could not find container \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": container with ID starting with a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4 not found: ID does not 
exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.366020 4834 scope.go:117] "RemoveContainer" containerID="049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.366268 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755"} err="failed to get container status \"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\": rpc error: code = NotFound desc = could not find container \"049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755\": container with ID starting with 049a1a4974a11e319c0453dce00ac622630a5932d35bbedf3bddcd4cecee8755 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.366293 4834 scope.go:117] "RemoveContainer" containerID="913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.366534 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589"} err="failed to get container status \"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\": rpc error: code = NotFound desc = could not find container \"913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589\": container with ID starting with 913194e7d8b997d20e61b9b966757be572c5ddb47865456e5059af786b54e589 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.366575 4834 scope.go:117] "RemoveContainer" containerID="44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.366752 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c"} err="failed to get container status 
\"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\": rpc error: code = NotFound desc = could not find container \"44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c\": container with ID starting with 44212f641dfcb653e2b4f6d155371d16c166ba258ff047778bbbdc7d74c03e1c not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.366773 4834 scope.go:117] "RemoveContainer" containerID="7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.366952 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087"} err="failed to get container status \"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\": rpc error: code = NotFound desc = could not find container \"7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087\": container with ID starting with 7bea2109741131800dc02a2ecdc209d1b59f8988fcad24ef4c3303b9918be087 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.366976 4834 scope.go:117] "RemoveContainer" containerID="5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.367178 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49"} err="failed to get container status \"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\": rpc error: code = NotFound desc = could not find container \"5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49\": container with ID starting with 5d805d85b18db0c283033ed4ca225e85197e75957853f744af0f7325d6763a49 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.367202 4834 scope.go:117] "RemoveContainer" 
containerID="fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.367432 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a"} err="failed to get container status \"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\": rpc error: code = NotFound desc = could not find container \"fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a\": container with ID starting with fd2b03ef385e758901b1443106dc8c0278875b1752aeeaf5fb08d48e0e97d00a not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.367462 4834 scope.go:117] "RemoveContainer" containerID="8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.367709 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b"} err="failed to get container status \"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\": rpc error: code = NotFound desc = could not find container \"8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b\": container with ID starting with 8f1d4b4324c672ffa653d5a6d08ebe4c1a4ce9b6e7366dd4bff44efaeb0f8f9b not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.367728 4834 scope.go:117] "RemoveContainer" containerID="a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.367967 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443"} err="failed to get container status \"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\": rpc error: code = NotFound desc = could 
not find container \"a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443\": container with ID starting with a9d14c13e6c38af44ee5218af37dce813eaf0e1094f337a86a88c9f81df60443 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.367983 4834 scope.go:117] "RemoveContainer" containerID="feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.368216 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859"} err="failed to get container status \"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\": rpc error: code = NotFound desc = could not find container \"feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859\": container with ID starting with feba6e25271e49115572b7eb83337893c4912d705cf2ff39909d1bac22ad8859 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.368233 4834 scope.go:117] "RemoveContainer" containerID="a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.368486 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4"} err="failed to get container status \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": rpc error: code = NotFound desc = could not find container \"a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4\": container with ID starting with a009bef1355d4d74e54c4912f85f4e153e2bf42dcbd40b0b967fdad72b1e9be4 not found: ID does not exist" Nov 26 12:20:22 crc kubenswrapper[4834]: I1126 12:20:22.427451 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f44620-97b4-4cdb-8252-d8a2971830fa" 
path="/var/lib/kubelet/pods/e7f44620-97b4-4cdb-8252-d8a2971830fa/volumes" Nov 26 12:20:23 crc kubenswrapper[4834]: I1126 12:20:23.249797 4834 generic.go:334] "Generic (PLEG): container finished" podID="5c8e9e0c-7ab0-4622-9e0d-012a27d59108" containerID="3db6e3b665d331ff20b23a5914ea68030a0765dc8aa4149da5c0021386d994cd" exitCode=0 Nov 26 12:20:23 crc kubenswrapper[4834]: I1126 12:20:23.249874 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerDied","Data":"3db6e3b665d331ff20b23a5914ea68030a0765dc8aa4149da5c0021386d994cd"} Nov 26 12:20:23 crc kubenswrapper[4834]: I1126 12:20:23.250093 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerStarted","Data":"a53fe8fbda6cfe4f2c12109280af7e6352f32e6b0fcac5b14c5e78059fa826df"} Nov 26 12:20:24 crc kubenswrapper[4834]: I1126 12:20:24.257220 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerStarted","Data":"884ff7d99fffc9c32e894e9c78d77c02158df812d5ad713464a0ef2e2f653ca1"} Nov 26 12:20:24 crc kubenswrapper[4834]: I1126 12:20:24.257533 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerStarted","Data":"437d5b415f8c77edba839bf3baf17448994f6e33a8015a7f3bce89b19196fa1d"} Nov 26 12:20:24 crc kubenswrapper[4834]: I1126 12:20:24.257544 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerStarted","Data":"9ed5e302657800068674409409e7b2ead8c5f06ebd1a2f11a434cdfef8a047b1"} Nov 26 12:20:24 crc kubenswrapper[4834]: I1126 12:20:24.257554 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerStarted","Data":"0243fac9f064b36e0a8beb19166b73f4592de09d3385afd6da274f8335000d95"} Nov 26 12:20:24 crc kubenswrapper[4834]: I1126 12:20:24.257563 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerStarted","Data":"63b9c06c285201449dccf82e23ab235d60411b8b986efe4ad41ad572b4070a8a"} Nov 26 12:20:24 crc kubenswrapper[4834]: I1126 12:20:24.257593 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerStarted","Data":"4e0d77a5a0e64839424b6c3d22a736059f3af91f8a2862268ed535aa40927ebc"} Nov 26 12:20:26 crc kubenswrapper[4834]: I1126 12:20:26.271688 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerStarted","Data":"709e99b6dbbb74098d257f48232e12d7f97313ecf33e93f02cebac3e28d36e13"} Nov 26 12:20:28 crc kubenswrapper[4834]: I1126 12:20:28.287949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" event={"ID":"5c8e9e0c-7ab0-4622-9e0d-012a27d59108","Type":"ContainerStarted","Data":"a33d98a48d4a9e8ae079b052efb6cdb10fd93cc9507347455cafc66a9755af63"} Nov 26 12:20:28 crc kubenswrapper[4834]: I1126 12:20:28.288447 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:28 crc kubenswrapper[4834]: I1126 12:20:28.288461 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:28 crc kubenswrapper[4834]: I1126 12:20:28.319234 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:28 crc kubenswrapper[4834]: I1126 12:20:28.334386 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" podStartSLOduration=6.334371056 podStartE2EDuration="6.334371056s" podCreationTimestamp="2025-11-26 12:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:20:28.327350022 +0000 UTC m=+526.234563373" watchObservedRunningTime="2025-11-26 12:20:28.334371056 +0000 UTC m=+526.241584409" Nov 26 12:20:29 crc kubenswrapper[4834]: I1126 12:20:29.292192 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:29 crc kubenswrapper[4834]: I1126 12:20:29.315294 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:35 crc kubenswrapper[4834]: I1126 12:20:35.417304 4834 scope.go:117] "RemoveContainer" containerID="516605244a77487311ed89cb7e25822e4283eae057abab1bf90158947e45584f" Nov 26 12:20:36 crc kubenswrapper[4834]: I1126 12:20:36.335279 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k8hjt_234b786b-76dd-4238-81bd-a743042bece9/kube-multus/1.log" Nov 26 12:20:36 crc kubenswrapper[4834]: I1126 12:20:36.336248 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k8hjt_234b786b-76dd-4238-81bd-a743042bece9/kube-multus/0.log" Nov 26 12:20:36 crc kubenswrapper[4834]: I1126 12:20:36.336337 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k8hjt" event={"ID":"234b786b-76dd-4238-81bd-a743042bece9","Type":"ContainerStarted","Data":"6783a48085ce02b87e75e0294d387cf03b524fa7de8831fbde34ae640473582c"} Nov 26 12:20:42 crc kubenswrapper[4834]: I1126 12:20:42.584147 
4834 scope.go:117] "RemoveContainer" containerID="a981ea47a64ebef1d8328e6dd8b51c08efc171ef578b638a0ddd27339dc202ea" Nov 26 12:20:43 crc kubenswrapper[4834]: I1126 12:20:43.372573 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k8hjt_234b786b-76dd-4238-81bd-a743042bece9/kube-multus/1.log" Nov 26 12:20:51 crc kubenswrapper[4834]: I1126 12:20:51.531235 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:20:51 crc kubenswrapper[4834]: I1126 12:20:51.532065 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:20:52 crc kubenswrapper[4834]: I1126 12:20:52.359643 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-62tqg" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.138860 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42"] Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.139971 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.141065 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.145683 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42"] Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.156673 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.156720 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbkr9\" (UniqueName: \"kubernetes.io/projected/c81bac3a-9953-4132-9869-f1f12d228844-kube-api-access-mbkr9\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.156770 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: 
I1126 12:20:53.257798 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.257847 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbkr9\" (UniqueName: \"kubernetes.io/projected/c81bac3a-9953-4132-9869-f1f12d228844-kube-api-access-mbkr9\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.257877 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.258280 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.258353 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.274723 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbkr9\" (UniqueName: \"kubernetes.io/projected/c81bac3a-9953-4132-9869-f1f12d228844-kube-api-access-mbkr9\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.452378 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:53 crc kubenswrapper[4834]: I1126 12:20:53.797556 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42"] Nov 26 12:20:54 crc kubenswrapper[4834]: I1126 12:20:54.435058 4834 generic.go:334] "Generic (PLEG): container finished" podID="c81bac3a-9953-4132-9869-f1f12d228844" containerID="f01df722cfe73c32b55e80e0ea20ced06182ed23df52e9c48a1a16f0d2b7cac5" exitCode=0 Nov 26 12:20:54 crc kubenswrapper[4834]: I1126 12:20:54.435264 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" event={"ID":"c81bac3a-9953-4132-9869-f1f12d228844","Type":"ContainerDied","Data":"f01df722cfe73c32b55e80e0ea20ced06182ed23df52e9c48a1a16f0d2b7cac5"} Nov 26 12:20:54 crc kubenswrapper[4834]: I1126 12:20:54.435426 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" event={"ID":"c81bac3a-9953-4132-9869-f1f12d228844","Type":"ContainerStarted","Data":"42dc0b8b8adee9287d1fd2431f2fe39aec8ba38eab6ab609740c75f2b1d40627"} Nov 26 12:20:56 crc kubenswrapper[4834]: I1126 12:20:56.446899 4834 generic.go:334] "Generic (PLEG): container finished" podID="c81bac3a-9953-4132-9869-f1f12d228844" containerID="133082e87711444ee4218c3502efae27545d412d2232212581ff36e73668ec4e" exitCode=0 Nov 26 12:20:56 crc kubenswrapper[4834]: I1126 12:20:56.447197 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" event={"ID":"c81bac3a-9953-4132-9869-f1f12d228844","Type":"ContainerDied","Data":"133082e87711444ee4218c3502efae27545d412d2232212581ff36e73668ec4e"} Nov 26 12:20:57 crc kubenswrapper[4834]: I1126 12:20:57.454048 4834 generic.go:334] "Generic (PLEG): container finished" podID="c81bac3a-9953-4132-9869-f1f12d228844" containerID="0b40d95c717760cff20b731b81665682873f681c8f134a566ceef197c33f0a1e" exitCode=0 Nov 26 12:20:57 crc kubenswrapper[4834]: I1126 12:20:57.454091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" event={"ID":"c81bac3a-9953-4132-9869-f1f12d228844","Type":"ContainerDied","Data":"0b40d95c717760cff20b731b81665682873f681c8f134a566ceef197c33f0a1e"} Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.651956 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.819705 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-bundle\") pod \"c81bac3a-9953-4132-9869-f1f12d228844\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.819788 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-util\") pod \"c81bac3a-9953-4132-9869-f1f12d228844\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.819852 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbkr9\" (UniqueName: \"kubernetes.io/projected/c81bac3a-9953-4132-9869-f1f12d228844-kube-api-access-mbkr9\") pod \"c81bac3a-9953-4132-9869-f1f12d228844\" (UID: \"c81bac3a-9953-4132-9869-f1f12d228844\") " Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.820391 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-bundle" (OuterVolumeSpecName: "bundle") pod "c81bac3a-9953-4132-9869-f1f12d228844" (UID: "c81bac3a-9953-4132-9869-f1f12d228844"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.825040 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81bac3a-9953-4132-9869-f1f12d228844-kube-api-access-mbkr9" (OuterVolumeSpecName: "kube-api-access-mbkr9") pod "c81bac3a-9953-4132-9869-f1f12d228844" (UID: "c81bac3a-9953-4132-9869-f1f12d228844"). InnerVolumeSpecName "kube-api-access-mbkr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.830008 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-util" (OuterVolumeSpecName: "util") pod "c81bac3a-9953-4132-9869-f1f12d228844" (UID: "c81bac3a-9953-4132-9869-f1f12d228844"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.921074 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.921105 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c81bac3a-9953-4132-9869-f1f12d228844-util\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:58 crc kubenswrapper[4834]: I1126 12:20:58.921115 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbkr9\" (UniqueName: \"kubernetes.io/projected/c81bac3a-9953-4132-9869-f1f12d228844-kube-api-access-mbkr9\") on node \"crc\" DevicePath \"\"" Nov 26 12:20:59 crc kubenswrapper[4834]: I1126 12:20:59.467749 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" event={"ID":"c81bac3a-9953-4132-9869-f1f12d228844","Type":"ContainerDied","Data":"42dc0b8b8adee9287d1fd2431f2fe39aec8ba38eab6ab609740c75f2b1d40627"} Nov 26 12:20:59 crc kubenswrapper[4834]: I1126 12:20:59.467797 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42dc0b8b8adee9287d1fd2431f2fe39aec8ba38eab6ab609740c75f2b1d40627" Nov 26 12:20:59 crc kubenswrapper[4834]: I1126 12:20:59.467843 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.176299 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-xhzg5"] Nov 26 12:21:01 crc kubenswrapper[4834]: E1126 12:21:01.176780 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81bac3a-9953-4132-9869-f1f12d228844" containerName="extract" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.176792 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81bac3a-9953-4132-9869-f1f12d228844" containerName="extract" Nov 26 12:21:01 crc kubenswrapper[4834]: E1126 12:21:01.176803 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81bac3a-9953-4132-9869-f1f12d228844" containerName="util" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.176810 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81bac3a-9953-4132-9869-f1f12d228844" containerName="util" Nov 26 12:21:01 crc kubenswrapper[4834]: E1126 12:21:01.176820 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81bac3a-9953-4132-9869-f1f12d228844" containerName="pull" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.176826 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81bac3a-9953-4132-9869-f1f12d228844" containerName="pull" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.176899 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81bac3a-9953-4132-9869-f1f12d228844" containerName="extract" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.177209 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-xhzg5" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.179205 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.180066 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.180287 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gg2dj" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.191076 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-xhzg5"] Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.348257 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5nsh\" (UniqueName: \"kubernetes.io/projected/0783fd9d-f205-4dee-87a3-be44ae70102a-kube-api-access-w5nsh\") pod \"nmstate-operator-557fdffb88-xhzg5\" (UID: \"0783fd9d-f205-4dee-87a3-be44ae70102a\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-xhzg5" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.450156 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5nsh\" (UniqueName: \"kubernetes.io/projected/0783fd9d-f205-4dee-87a3-be44ae70102a-kube-api-access-w5nsh\") pod \"nmstate-operator-557fdffb88-xhzg5\" (UID: \"0783fd9d-f205-4dee-87a3-be44ae70102a\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-xhzg5" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.465697 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5nsh\" (UniqueName: \"kubernetes.io/projected/0783fd9d-f205-4dee-87a3-be44ae70102a-kube-api-access-w5nsh\") pod \"nmstate-operator-557fdffb88-xhzg5\" (UID: 
\"0783fd9d-f205-4dee-87a3-be44ae70102a\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-xhzg5" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.488688 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-xhzg5" Nov 26 12:21:01 crc kubenswrapper[4834]: I1126 12:21:01.624826 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-xhzg5"] Nov 26 12:21:02 crc kubenswrapper[4834]: I1126 12:21:02.482787 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-xhzg5" event={"ID":"0783fd9d-f205-4dee-87a3-be44ae70102a","Type":"ContainerStarted","Data":"225ea226c36c4a8beab6d338f810ef267c08d8cb6419da8285a034192a437018"} Nov 26 12:21:04 crc kubenswrapper[4834]: I1126 12:21:04.494540 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-xhzg5" event={"ID":"0783fd9d-f205-4dee-87a3-be44ae70102a","Type":"ContainerStarted","Data":"9056812eb74b87fd7d5a317f90173e22faffbc470d528930ee080542cb4073e5"} Nov 26 12:21:04 crc kubenswrapper[4834]: I1126 12:21:04.507891 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-xhzg5" podStartSLOduration=1.40663644 podStartE2EDuration="3.507879505s" podCreationTimestamp="2025-11-26 12:21:01 +0000 UTC" firstStartedPulling="2025-11-26 12:21:01.633706237 +0000 UTC m=+559.540919589" lastFinishedPulling="2025-11-26 12:21:03.734949301 +0000 UTC m=+561.642162654" observedRunningTime="2025-11-26 12:21:04.505637749 +0000 UTC m=+562.412851102" watchObservedRunningTime="2025-11-26 12:21:04.507879505 +0000 UTC m=+562.415092858" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.276754 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n"] Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 
12:21:05.277726 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.279503 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-z5dhb" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.288584 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7"] Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.290180 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.292934 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.299271 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpvbg\" (UniqueName: \"kubernetes.io/projected/28c26289-ba85-46ce-b757-883b0ab3db27-kube-api-access-lpvbg\") pod \"nmstate-metrics-5dcf9c57c5-8847n\" (UID: \"28c26289-ba85-46ce-b757-883b0ab3db27\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.299419 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw4zs\" (UniqueName: \"kubernetes.io/projected/62e4d8b8-503c-4ce0-b534-95d90fae4c76-kube-api-access-zw4zs\") pod \"nmstate-webhook-6b89b748d8-p4hf7\" (UID: \"62e4d8b8-503c-4ce0-b534-95d90fae4c76\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.299600 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/62e4d8b8-503c-4ce0-b534-95d90fae4c76-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-p4hf7\" (UID: \"62e4d8b8-503c-4ce0-b534-95d90fae4c76\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.311767 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7"] Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.316739 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mxvrw"] Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.319820 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.334125 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n"] Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.400788 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpszn\" (UniqueName: \"kubernetes.io/projected/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-kube-api-access-cpszn\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.400938 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-nmstate-lock\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.401072 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/62e4d8b8-503c-4ce0-b534-95d90fae4c76-tls-key-pair\") pod 
\"nmstate-webhook-6b89b748d8-p4hf7\" (UID: \"62e4d8b8-503c-4ce0-b534-95d90fae4c76\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.401152 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-dbus-socket\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.401189 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvbg\" (UniqueName: \"kubernetes.io/projected/28c26289-ba85-46ce-b757-883b0ab3db27-kube-api-access-lpvbg\") pod \"nmstate-metrics-5dcf9c57c5-8847n\" (UID: \"28c26289-ba85-46ce-b757-883b0ab3db27\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.401248 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4zs\" (UniqueName: \"kubernetes.io/projected/62e4d8b8-503c-4ce0-b534-95d90fae4c76-kube-api-access-zw4zs\") pod \"nmstate-webhook-6b89b748d8-p4hf7\" (UID: \"62e4d8b8-503c-4ce0-b534-95d90fae4c76\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.401268 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-ovs-socket\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: E1126 12:21:05.401456 4834 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Nov 26 12:21:05 crc 
kubenswrapper[4834]: E1126 12:21:05.401520 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e4d8b8-503c-4ce0-b534-95d90fae4c76-tls-key-pair podName:62e4d8b8-503c-4ce0-b534-95d90fae4c76 nodeName:}" failed. No retries permitted until 2025-11-26 12:21:05.901500748 +0000 UTC m=+563.808714100 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/62e4d8b8-503c-4ce0-b534-95d90fae4c76-tls-key-pair") pod "nmstate-webhook-6b89b748d8-p4hf7" (UID: "62e4d8b8-503c-4ce0-b534-95d90fae4c76") : secret "openshift-nmstate-webhook" not found Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.424062 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvbg\" (UniqueName: \"kubernetes.io/projected/28c26289-ba85-46ce-b757-883b0ab3db27-kube-api-access-lpvbg\") pod \"nmstate-metrics-5dcf9c57c5-8847n\" (UID: \"28c26289-ba85-46ce-b757-883b0ab3db27\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.424212 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw4zs\" (UniqueName: \"kubernetes.io/projected/62e4d8b8-503c-4ce0-b534-95d90fae4c76-kube-api-access-zw4zs\") pod \"nmstate-webhook-6b89b748d8-p4hf7\" (UID: \"62e4d8b8-503c-4ce0-b534-95d90fae4c76\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.479961 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw"] Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.480730 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.482877 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.482966 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.483026 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6wkx9" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.501851 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-ovs-socket\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.501952 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpszn\" (UniqueName: \"kubernetes.io/projected/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-kube-api-access-cpszn\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.501979 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-nmstate-lock\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.502002 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztnxp\" (UniqueName: 
\"kubernetes.io/projected/1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f-kube-api-access-ztnxp\") pod \"nmstate-console-plugin-5874bd7bc5-nrvxw\" (UID: \"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.502004 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-ovs-socket\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.502067 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-nrvxw\" (UID: \"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.502091 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-nmstate-lock\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.502114 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-dbus-socket\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.502478 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-dbus-socket\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.502472 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-nrvxw\" (UID: \"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.518088 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpszn\" (UniqueName: \"kubernetes.io/projected/dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61-kube-api-access-cpszn\") pod \"nmstate-handler-mxvrw\" (UID: \"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61\") " pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.527020 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw"] Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.591913 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.603149 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-nrvxw\" (UID: \"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.603233 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztnxp\" (UniqueName: \"kubernetes.io/projected/1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f-kube-api-access-ztnxp\") pod \"nmstate-console-plugin-5874bd7bc5-nrvxw\" (UID: \"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.603294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-nrvxw\" (UID: \"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.604666 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-nrvxw\" (UID: \"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.610212 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-nrvxw\" (UID: \"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.628883 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztnxp\" (UniqueName: \"kubernetes.io/projected/1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f-kube-api-access-ztnxp\") pod \"nmstate-console-plugin-5874bd7bc5-nrvxw\" (UID: \"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.633765 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.669893 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64d5fd7569-b6vhx"] Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.685056 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.714688 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d5fd7569-b6vhx"] Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.794569 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.798798 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n"] Nov 26 12:21:05 crc kubenswrapper[4834]: W1126 12:21:05.804410 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c26289_ba85_46ce_b757_883b0ab3db27.slice/crio-b8747e84d4de829a791d67bd0bb2a233d04f641b74ae26043cb5f58f4d6ac408 WatchSource:0}: Error finding container b8747e84d4de829a791d67bd0bb2a233d04f641b74ae26043cb5f58f4d6ac408: Status 404 returned error can't find the container with id b8747e84d4de829a791d67bd0bb2a233d04f641b74ae26043cb5f58f4d6ac408 Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.805347 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4077090d-6403-4b4c-bb21-129bb9690fa5-console-serving-cert\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.805394 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-service-ca\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.805454 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-console-config\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " 
pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.805496 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-oauth-serving-cert\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.805516 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckmv\" (UniqueName: \"kubernetes.io/projected/4077090d-6403-4b4c-bb21-129bb9690fa5-kube-api-access-zckmv\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.805535 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4077090d-6403-4b4c-bb21-129bb9690fa5-console-oauth-config\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.805559 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-trusted-ca-bundle\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.907966 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-console-config\") pod 
\"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.908035 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/62e4d8b8-503c-4ce0-b534-95d90fae4c76-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-p4hf7\" (UID: \"62e4d8b8-503c-4ce0-b534-95d90fae4c76\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.908104 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-oauth-serving-cert\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.908136 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zckmv\" (UniqueName: \"kubernetes.io/projected/4077090d-6403-4b4c-bb21-129bb9690fa5-kube-api-access-zckmv\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.908189 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4077090d-6403-4b4c-bb21-129bb9690fa5-console-oauth-config\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.908350 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-trusted-ca-bundle\") pod 
\"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.908426 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4077090d-6403-4b4c-bb21-129bb9690fa5-console-serving-cert\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.908465 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-service-ca\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.909398 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-console-config\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.910420 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-service-ca\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.910546 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-trusted-ca-bundle\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " 
pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.910774 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4077090d-6403-4b4c-bb21-129bb9690fa5-oauth-serving-cert\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.913147 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4077090d-6403-4b4c-bb21-129bb9690fa5-console-oauth-config\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.913720 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4077090d-6403-4b4c-bb21-129bb9690fa5-console-serving-cert\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.914074 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/62e4d8b8-503c-4ce0-b534-95d90fae4c76-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-p4hf7\" (UID: \"62e4d8b8-503c-4ce0-b534-95d90fae4c76\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:05 crc kubenswrapper[4834]: I1126 12:21:05.923906 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zckmv\" (UniqueName: \"kubernetes.io/projected/4077090d-6403-4b4c-bb21-129bb9690fa5-kube-api-access-zckmv\") pod \"console-64d5fd7569-b6vhx\" (UID: \"4077090d-6403-4b4c-bb21-129bb9690fa5\") " 
pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.015823 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.154285 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw"] Nov 26 12:21:06 crc kubenswrapper[4834]: W1126 12:21:06.157066 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b9a0efe_1ec3_4a49_b612_a4dd462f9b9f.slice/crio-41b4c8bb76ed9d41ba9351a55eb760533ec55cb09821b0bb6b8426c1583fccdb WatchSource:0}: Error finding container 41b4c8bb76ed9d41ba9351a55eb760533ec55cb09821b0bb6b8426c1583fccdb: Status 404 returned error can't find the container with id 41b4c8bb76ed9d41ba9351a55eb760533ec55cb09821b0bb6b8426c1583fccdb Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.179340 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d5fd7569-b6vhx"] Nov 26 12:21:06 crc kubenswrapper[4834]: W1126 12:21:06.181430 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4077090d_6403_4b4c_bb21_129bb9690fa5.slice/crio-f1c320c155adac15e03ac5b1975a6954b4b140d164ed73e84fee81aed11dd747 WatchSource:0}: Error finding container f1c320c155adac15e03ac5b1975a6954b4b140d164ed73e84fee81aed11dd747: Status 404 returned error can't find the container with id f1c320c155adac15e03ac5b1975a6954b4b140d164ed73e84fee81aed11dd747 Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.203382 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.378167 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7"] Nov 26 12:21:06 crc kubenswrapper[4834]: W1126 12:21:06.380669 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62e4d8b8_503c_4ce0_b534_95d90fae4c76.slice/crio-3636f352d5b129fc3ead3974fe7dd2301222967b7e46e45b24408629b9494b81 WatchSource:0}: Error finding container 3636f352d5b129fc3ead3974fe7dd2301222967b7e46e45b24408629b9494b81: Status 404 returned error can't find the container with id 3636f352d5b129fc3ead3974fe7dd2301222967b7e46e45b24408629b9494b81 Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.504934 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mxvrw" event={"ID":"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61","Type":"ContainerStarted","Data":"bb408dacc4581020d4ebcd48800469b9e0a1fff0c9f73695d08c7e4e316e9dd7"} Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.506140 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" event={"ID":"62e4d8b8-503c-4ce0-b534-95d90fae4c76","Type":"ContainerStarted","Data":"3636f352d5b129fc3ead3974fe7dd2301222967b7e46e45b24408629b9494b81"} Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.507268 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" event={"ID":"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f","Type":"ContainerStarted","Data":"41b4c8bb76ed9d41ba9351a55eb760533ec55cb09821b0bb6b8426c1583fccdb"} Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.508262 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n" 
event={"ID":"28c26289-ba85-46ce-b757-883b0ab3db27","Type":"ContainerStarted","Data":"b8747e84d4de829a791d67bd0bb2a233d04f641b74ae26043cb5f58f4d6ac408"} Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.509646 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d5fd7569-b6vhx" event={"ID":"4077090d-6403-4b4c-bb21-129bb9690fa5","Type":"ContainerStarted","Data":"9a827d47bbe47c39971582552f9b9d54d8f0583ac9c56ca3102d6b722862bbba"} Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.509673 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d5fd7569-b6vhx" event={"ID":"4077090d-6403-4b4c-bb21-129bb9690fa5","Type":"ContainerStarted","Data":"f1c320c155adac15e03ac5b1975a6954b4b140d164ed73e84fee81aed11dd747"} Nov 26 12:21:06 crc kubenswrapper[4834]: I1126 12:21:06.522558 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64d5fd7569-b6vhx" podStartSLOduration=1.5225381 podStartE2EDuration="1.5225381s" podCreationTimestamp="2025-11-26 12:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:21:06.521156471 +0000 UTC m=+564.428369824" watchObservedRunningTime="2025-11-26 12:21:06.5225381 +0000 UTC m=+564.429751452" Nov 26 12:21:09 crc kubenswrapper[4834]: I1126 12:21:09.524655 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" event={"ID":"1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f","Type":"ContainerStarted","Data":"8792c6c87bca857cb365efea44cdb3ac49f4a51515df83a95b50aa50764afd29"} Nov 26 12:21:09 crc kubenswrapper[4834]: I1126 12:21:09.526401 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n" 
event={"ID":"28c26289-ba85-46ce-b757-883b0ab3db27","Type":"ContainerStarted","Data":"05fd78d50bdcd97d6cc337aad38135d824c0700138feb2ebb8f6f2f9757bd7ed"} Nov 26 12:21:09 crc kubenswrapper[4834]: I1126 12:21:09.528661 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mxvrw" event={"ID":"dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61","Type":"ContainerStarted","Data":"12275cef7ffdd72f3c9ed8b1c245b5d8dd7026c800b2f52f44ea1b5dc67d2a5c"} Nov 26 12:21:09 crc kubenswrapper[4834]: I1126 12:21:09.528752 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:09 crc kubenswrapper[4834]: I1126 12:21:09.531774 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" event={"ID":"62e4d8b8-503c-4ce0-b534-95d90fae4c76","Type":"ContainerStarted","Data":"5dabed0d7a30f5f10c789ecace81b40e648522fe9135d080e667bed3535043ff"} Nov 26 12:21:09 crc kubenswrapper[4834]: I1126 12:21:09.531944 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:09 crc kubenswrapper[4834]: I1126 12:21:09.570083 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" podStartSLOduration=2.194080401 podStartE2EDuration="4.57006253s" podCreationTimestamp="2025-11-26 12:21:05 +0000 UTC" firstStartedPulling="2025-11-26 12:21:06.383543846 +0000 UTC m=+564.290757198" lastFinishedPulling="2025-11-26 12:21:08.759525975 +0000 UTC m=+566.666739327" observedRunningTime="2025-11-26 12:21:09.561848732 +0000 UTC m=+567.469062083" watchObservedRunningTime="2025-11-26 12:21:09.57006253 +0000 UTC m=+567.477275882" Nov 26 12:21:09 crc kubenswrapper[4834]: I1126 12:21:09.570775 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-nrvxw" 
podStartSLOduration=1.998815724 podStartE2EDuration="4.570768578s" podCreationTimestamp="2025-11-26 12:21:05 +0000 UTC" firstStartedPulling="2025-11-26 12:21:06.159614065 +0000 UTC m=+564.066827417" lastFinishedPulling="2025-11-26 12:21:08.73156692 +0000 UTC m=+566.638780271" observedRunningTime="2025-11-26 12:21:09.541100838 +0000 UTC m=+567.448314190" watchObservedRunningTime="2025-11-26 12:21:09.570768578 +0000 UTC m=+567.477981930" Nov 26 12:21:09 crc kubenswrapper[4834]: I1126 12:21:09.578446 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mxvrw" podStartSLOduration=1.514383972 podStartE2EDuration="4.578428545s" podCreationTimestamp="2025-11-26 12:21:05 +0000 UTC" firstStartedPulling="2025-11-26 12:21:05.670338254 +0000 UTC m=+563.577551605" lastFinishedPulling="2025-11-26 12:21:08.734382825 +0000 UTC m=+566.641596178" observedRunningTime="2025-11-26 12:21:09.573149156 +0000 UTC m=+567.480362508" watchObservedRunningTime="2025-11-26 12:21:09.578428545 +0000 UTC m=+567.485641897" Nov 26 12:21:11 crc kubenswrapper[4834]: I1126 12:21:11.546708 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n" event={"ID":"28c26289-ba85-46ce-b757-883b0ab3db27","Type":"ContainerStarted","Data":"a9692121129c87062855a3af674ad678070548f7ef88d49dacfe8ecc16b2ab0f"} Nov 26 12:21:15 crc kubenswrapper[4834]: I1126 12:21:15.653600 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mxvrw" Nov 26 12:21:15 crc kubenswrapper[4834]: I1126 12:21:15.667874 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-8847n" podStartSLOduration=5.52035445 podStartE2EDuration="10.667856945s" podCreationTimestamp="2025-11-26 12:21:05 +0000 UTC" firstStartedPulling="2025-11-26 12:21:05.808012664 +0000 UTC m=+563.715226017" lastFinishedPulling="2025-11-26 
12:21:10.955515159 +0000 UTC m=+568.862728512" observedRunningTime="2025-11-26 12:21:11.56124504 +0000 UTC m=+569.468458391" watchObservedRunningTime="2025-11-26 12:21:15.667856945 +0000 UTC m=+573.575070297" Nov 26 12:21:16 crc kubenswrapper[4834]: I1126 12:21:16.016542 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:16 crc kubenswrapper[4834]: I1126 12:21:16.016597 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:16 crc kubenswrapper[4834]: I1126 12:21:16.022562 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:16 crc kubenswrapper[4834]: I1126 12:21:16.578326 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64d5fd7569-b6vhx" Nov 26 12:21:16 crc kubenswrapper[4834]: I1126 12:21:16.617434 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-z2dpf"] Nov 26 12:21:21 crc kubenswrapper[4834]: I1126 12:21:21.531830 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:21:21 crc kubenswrapper[4834]: I1126 12:21:21.532139 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:21:21 crc kubenswrapper[4834]: I1126 12:21:21.532186 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:21:21 crc kubenswrapper[4834]: I1126 12:21:21.532544 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"758813397b9c2c2ccad432b9cb11a0273141f48fa83e1aaad525d33be881d337"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:21:21 crc kubenswrapper[4834]: I1126 12:21:21.532593 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://758813397b9c2c2ccad432b9cb11a0273141f48fa83e1aaad525d33be881d337" gracePeriod=600 Nov 26 12:21:22 crc kubenswrapper[4834]: I1126 12:21:22.603138 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="758813397b9c2c2ccad432b9cb11a0273141f48fa83e1aaad525d33be881d337" exitCode=0 Nov 26 12:21:22 crc kubenswrapper[4834]: I1126 12:21:22.603218 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"758813397b9c2c2ccad432b9cb11a0273141f48fa83e1aaad525d33be881d337"} Nov 26 12:21:22 crc kubenswrapper[4834]: I1126 12:21:22.603468 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"a287c3eb83da9750c870469d031aef3c2e315d099f1dc03a585c96cc0c81709e"} Nov 26 12:21:22 crc kubenswrapper[4834]: I1126 12:21:22.603496 4834 scope.go:117] "RemoveContainer" 
containerID="f165beb62e23145bceea6aef6fcd553d3b72544fd470871ee30eb868544a4707" Nov 26 12:21:26 crc kubenswrapper[4834]: I1126 12:21:26.210214 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-p4hf7" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.692520 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544"] Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.693944 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.695495 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.704145 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544"] Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.778694 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.778763 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") " 
pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.778857 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmz9\" (UniqueName: \"kubernetes.io/projected/1c0edd6e-551f-4a07-9201-bc4b9d13a796-kube-api-access-xcmz9\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.880528 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmz9\" (UniqueName: \"kubernetes.io/projected/1c0edd6e-551f-4a07-9201-bc4b9d13a796-kube-api-access-xcmz9\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.880729 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.880795 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 
12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.881164 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.881414 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 12:21:35 crc kubenswrapper[4834]: I1126 12:21:35.898471 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmz9\" (UniqueName: \"kubernetes.io/projected/1c0edd6e-551f-4a07-9201-bc4b9d13a796-kube-api-access-xcmz9\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" Nov 26 12:21:36 crc kubenswrapper[4834]: I1126 12:21:36.007055 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544"
Nov 26 12:21:36 crc kubenswrapper[4834]: I1126 12:21:36.369419 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544"]
Nov 26 12:21:36 crc kubenswrapper[4834]: I1126 12:21:36.683090 4834 generic.go:334] "Generic (PLEG): container finished" podID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerID="464d76a5dd4c990c8c6ce26a7b1d07e30f64e0de4923d6f268d495c24bde873e" exitCode=0
Nov 26 12:21:36 crc kubenswrapper[4834]: I1126 12:21:36.683143 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" event={"ID":"1c0edd6e-551f-4a07-9201-bc4b9d13a796","Type":"ContainerDied","Data":"464d76a5dd4c990c8c6ce26a7b1d07e30f64e0de4923d6f268d495c24bde873e"}
Nov 26 12:21:36 crc kubenswrapper[4834]: I1126 12:21:36.683174 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" event={"ID":"1c0edd6e-551f-4a07-9201-bc4b9d13a796","Type":"ContainerStarted","Data":"6c7c18d4ccff1e8edfb6202675a565312a981712276da7e6dd49f5e4e47ecc65"}
Nov 26 12:21:38 crc kubenswrapper[4834]: I1126 12:21:38.694095 4834 generic.go:334] "Generic (PLEG): container finished" podID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerID="986c4460866c267cddeb7224358b0996bf39169e2f9028cdb46d8b74a4ed5246" exitCode=0
Nov 26 12:21:38 crc kubenswrapper[4834]: I1126 12:21:38.694167 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" event={"ID":"1c0edd6e-551f-4a07-9201-bc4b9d13a796","Type":"ContainerDied","Data":"986c4460866c267cddeb7224358b0996bf39169e2f9028cdb46d8b74a4ed5246"}
Nov 26 12:21:39 crc kubenswrapper[4834]: I1126 12:21:39.701996 4834 generic.go:334] "Generic (PLEG): container finished" podID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerID="b67b77dea635de88c4e3294aaab2aa54d9a11ec2ffc4c0b228d720d0fbc9875b" exitCode=0
Nov 26 12:21:39 crc kubenswrapper[4834]: I1126 12:21:39.702062 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" event={"ID":"1c0edd6e-551f-4a07-9201-bc4b9d13a796","Type":"ContainerDied","Data":"b67b77dea635de88c4e3294aaab2aa54d9a11ec2ffc4c0b228d720d0fbc9875b"}
Nov 26 12:21:40 crc kubenswrapper[4834]: I1126 12:21:40.885278 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544"
Nov 26 12:21:40 crc kubenswrapper[4834]: I1126 12:21:40.941514 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcmz9\" (UniqueName: \"kubernetes.io/projected/1c0edd6e-551f-4a07-9201-bc4b9d13a796-kube-api-access-xcmz9\") pod \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") "
Nov 26 12:21:40 crc kubenswrapper[4834]: I1126 12:21:40.941792 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-util\") pod \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") "
Nov 26 12:21:40 crc kubenswrapper[4834]: I1126 12:21:40.941886 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-bundle\") pod \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\" (UID: \"1c0edd6e-551f-4a07-9201-bc4b9d13a796\") "
Nov 26 12:21:40 crc kubenswrapper[4834]: I1126 12:21:40.943378 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-bundle" (OuterVolumeSpecName: "bundle") pod "1c0edd6e-551f-4a07-9201-bc4b9d13a796" (UID: "1c0edd6e-551f-4a07-9201-bc4b9d13a796"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:21:40 crc kubenswrapper[4834]: I1126 12:21:40.947255 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0edd6e-551f-4a07-9201-bc4b9d13a796-kube-api-access-xcmz9" (OuterVolumeSpecName: "kube-api-access-xcmz9") pod "1c0edd6e-551f-4a07-9201-bc4b9d13a796" (UID: "1c0edd6e-551f-4a07-9201-bc4b9d13a796"). InnerVolumeSpecName "kube-api-access-xcmz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:21:40 crc kubenswrapper[4834]: I1126 12:21:40.952586 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-util" (OuterVolumeSpecName: "util") pod "1c0edd6e-551f-4a07-9201-bc4b9d13a796" (UID: "1c0edd6e-551f-4a07-9201-bc4b9d13a796"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:21:41 crc kubenswrapper[4834]: I1126 12:21:41.043373 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcmz9\" (UniqueName: \"kubernetes.io/projected/1c0edd6e-551f-4a07-9201-bc4b9d13a796-kube-api-access-xcmz9\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:41 crc kubenswrapper[4834]: I1126 12:21:41.043410 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-util\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:41 crc kubenswrapper[4834]: I1126 12:21:41.043422 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c0edd6e-551f-4a07-9201-bc4b9d13a796-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:41 crc kubenswrapper[4834]: I1126 12:21:41.650922 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-z2dpf" podUID="17225c8b-a6dd-4958-b1be-58b3cc4ad317" containerName="console" containerID="cri-o://073177f09678c962a9da978555323d2dd073f556577ea158fb4849d48bcfbbb5" gracePeriod=15
Nov 26 12:21:41 crc kubenswrapper[4834]: I1126 12:21:41.715558 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544" event={"ID":"1c0edd6e-551f-4a07-9201-bc4b9d13a796","Type":"ContainerDied","Data":"6c7c18d4ccff1e8edfb6202675a565312a981712276da7e6dd49f5e4e47ecc65"}
Nov 26 12:21:41 crc kubenswrapper[4834]: I1126 12:21:41.715628 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7c18d4ccff1e8edfb6202675a565312a981712276da7e6dd49f5e4e47ecc65"
Nov 26 12:21:41 crc kubenswrapper[4834]: I1126 12:21:41.715600 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544"
Nov 26 12:21:41 crc kubenswrapper[4834]: I1126 12:21:41.980602 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-z2dpf_17225c8b-a6dd-4958-b1be-58b3cc4ad317/console/0.log"
Nov 26 12:21:41 crc kubenswrapper[4834]: I1126 12:21:41.980668 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z2dpf"
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.054902 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-oauth-serving-cert\") pod \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") "
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.054955 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-config\") pod \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") "
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.054992 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-oauth-config\") pod \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") "
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.055101 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-trusted-ca-bundle\") pod \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") "
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.055189 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tm6g\" (UniqueName: \"kubernetes.io/projected/17225c8b-a6dd-4958-b1be-58b3cc4ad317-kube-api-access-7tm6g\") pod \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") "
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.055282 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-service-ca\") pod \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") "
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.055355 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-serving-cert\") pod \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\" (UID: \"17225c8b-a6dd-4958-b1be-58b3cc4ad317\") "
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.056232 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "17225c8b-a6dd-4958-b1be-58b3cc4ad317" (UID: "17225c8b-a6dd-4958-b1be-58b3cc4ad317"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.056268 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-service-ca" (OuterVolumeSpecName: "service-ca") pod "17225c8b-a6dd-4958-b1be-58b3cc4ad317" (UID: "17225c8b-a6dd-4958-b1be-58b3cc4ad317"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.056237 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-config" (OuterVolumeSpecName: "console-config") pod "17225c8b-a6dd-4958-b1be-58b3cc4ad317" (UID: "17225c8b-a6dd-4958-b1be-58b3cc4ad317"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.056675 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "17225c8b-a6dd-4958-b1be-58b3cc4ad317" (UID: "17225c8b-a6dd-4958-b1be-58b3cc4ad317"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.060120 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17225c8b-a6dd-4958-b1be-58b3cc4ad317-kube-api-access-7tm6g" (OuterVolumeSpecName: "kube-api-access-7tm6g") pod "17225c8b-a6dd-4958-b1be-58b3cc4ad317" (UID: "17225c8b-a6dd-4958-b1be-58b3cc4ad317"). InnerVolumeSpecName "kube-api-access-7tm6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.060215 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "17225c8b-a6dd-4958-b1be-58b3cc4ad317" (UID: "17225c8b-a6dd-4958-b1be-58b3cc4ad317"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.060560 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "17225c8b-a6dd-4958-b1be-58b3cc4ad317" (UID: "17225c8b-a6dd-4958-b1be-58b3cc4ad317"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.156355 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.156394 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tm6g\" (UniqueName: \"kubernetes.io/projected/17225c8b-a6dd-4958-b1be-58b3cc4ad317-kube-api-access-7tm6g\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.156409 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-service-ca\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.156419 4834 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.156430 4834 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.156439 4834 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-config\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.156448 4834 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17225c8b-a6dd-4958-b1be-58b3cc4ad317-console-oauth-config\") on node \"crc\" DevicePath \"\""
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.620132 4834 scope.go:117] "RemoveContainer" containerID="073177f09678c962a9da978555323d2dd073f556577ea158fb4849d48bcfbbb5"
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.720870 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z2dpf" event={"ID":"17225c8b-a6dd-4958-b1be-58b3cc4ad317","Type":"ContainerDied","Data":"073177f09678c962a9da978555323d2dd073f556577ea158fb4849d48bcfbbb5"}
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.720928 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z2dpf"
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.720947 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z2dpf" event={"ID":"17225c8b-a6dd-4958-b1be-58b3cc4ad317","Type":"ContainerDied","Data":"166ddd96c15e3cb627d255e096a48504057654aea0a71d0ec5238a4e0f786fbd"}
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.753662 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-z2dpf"]
Nov 26 12:21:42 crc kubenswrapper[4834]: I1126 12:21:42.759528 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-z2dpf"]
Nov 26 12:21:44 crc kubenswrapper[4834]: I1126 12:21:44.422980 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17225c8b-a6dd-4958-b1be-58b3cc4ad317" path="/var/lib/kubelet/pods/17225c8b-a6dd-4958-b1be-58b3cc4ad317/volumes"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.220298 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-558846f6-bdb45"]
Nov 26 12:21:50 crc kubenswrapper[4834]: E1126 12:21:50.220702 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerName="extract"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.220714 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerName="extract"
Nov 26 12:21:50 crc kubenswrapper[4834]: E1126 12:21:50.220724 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerName="pull"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.220729 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerName="pull"
Nov 26 12:21:50 crc kubenswrapper[4834]: E1126 12:21:50.220738 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17225c8b-a6dd-4958-b1be-58b3cc4ad317" containerName="console"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.220744 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="17225c8b-a6dd-4958-b1be-58b3cc4ad317" containerName="console"
Nov 26 12:21:50 crc kubenswrapper[4834]: E1126 12:21:50.220752 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerName="util"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.220756 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerName="util"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.220833 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="17225c8b-a6dd-4958-b1be-58b3cc4ad317" containerName="console"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.220847 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0edd6e-551f-4a07-9201-bc4b9d13a796" containerName="extract"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.221173 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.224952 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.225245 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.225425 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.225557 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.225673 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5j7mq"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.233714 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-558846f6-bdb45"]
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.364949 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6src\" (UniqueName: \"kubernetes.io/projected/7038dc0b-b755-4c32-871a-27327898c558-kube-api-access-w6src\") pod \"metallb-operator-controller-manager-558846f6-bdb45\" (UID: \"7038dc0b-b755-4c32-871a-27327898c558\") " pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.365463 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7038dc0b-b755-4c32-871a-27327898c558-apiservice-cert\") pod \"metallb-operator-controller-manager-558846f6-bdb45\" (UID: \"7038dc0b-b755-4c32-871a-27327898c558\") " pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.365556 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7038dc0b-b755-4c32-871a-27327898c558-webhook-cert\") pod \"metallb-operator-controller-manager-558846f6-bdb45\" (UID: \"7038dc0b-b755-4c32-871a-27327898c558\") " pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.467103 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7038dc0b-b755-4c32-871a-27327898c558-webhook-cert\") pod \"metallb-operator-controller-manager-558846f6-bdb45\" (UID: \"7038dc0b-b755-4c32-871a-27327898c558\") " pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.467185 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6src\" (UniqueName: \"kubernetes.io/projected/7038dc0b-b755-4c32-871a-27327898c558-kube-api-access-w6src\") pod \"metallb-operator-controller-manager-558846f6-bdb45\" (UID: \"7038dc0b-b755-4c32-871a-27327898c558\") " pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.467227 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7038dc0b-b755-4c32-871a-27327898c558-apiservice-cert\") pod \"metallb-operator-controller-manager-558846f6-bdb45\" (UID: \"7038dc0b-b755-4c32-871a-27327898c558\") " pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.470888 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-884765884-jmk72"]
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.471542 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.473667 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-mcrvw"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.474251 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7038dc0b-b755-4c32-871a-27327898c558-apiservice-cert\") pod \"metallb-operator-controller-manager-558846f6-bdb45\" (UID: \"7038dc0b-b755-4c32-871a-27327898c558\") " pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.474392 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.485432 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-884765884-jmk72"]
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.495583 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.496016 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7038dc0b-b755-4c32-871a-27327898c558-webhook-cert\") pod \"metallb-operator-controller-manager-558846f6-bdb45\" (UID: \"7038dc0b-b755-4c32-871a-27327898c558\") " pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.504880 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6src\" (UniqueName: \"kubernetes.io/projected/7038dc0b-b755-4c32-871a-27327898c558-kube-api-access-w6src\") pod \"metallb-operator-controller-manager-558846f6-bdb45\" (UID: \"7038dc0b-b755-4c32-871a-27327898c558\") " pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.536651 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.678663 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s426r\" (UniqueName: \"kubernetes.io/projected/a8711e7a-0128-4160-9be7-6b25ccdf2ea6-kube-api-access-s426r\") pod \"metallb-operator-webhook-server-884765884-jmk72\" (UID: \"a8711e7a-0128-4160-9be7-6b25ccdf2ea6\") " pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.678765 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8711e7a-0128-4160-9be7-6b25ccdf2ea6-apiservice-cert\") pod \"metallb-operator-webhook-server-884765884-jmk72\" (UID: \"a8711e7a-0128-4160-9be7-6b25ccdf2ea6\") " pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.678794 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8711e7a-0128-4160-9be7-6b25ccdf2ea6-webhook-cert\") pod \"metallb-operator-webhook-server-884765884-jmk72\" (UID: \"a8711e7a-0128-4160-9be7-6b25ccdf2ea6\") " pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.779781 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s426r\" (UniqueName: \"kubernetes.io/projected/a8711e7a-0128-4160-9be7-6b25ccdf2ea6-kube-api-access-s426r\") pod \"metallb-operator-webhook-server-884765884-jmk72\" (UID: \"a8711e7a-0128-4160-9be7-6b25ccdf2ea6\") " pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.779828 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8711e7a-0128-4160-9be7-6b25ccdf2ea6-webhook-cert\") pod \"metallb-operator-webhook-server-884765884-jmk72\" (UID: \"a8711e7a-0128-4160-9be7-6b25ccdf2ea6\") " pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.779851 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8711e7a-0128-4160-9be7-6b25ccdf2ea6-apiservice-cert\") pod \"metallb-operator-webhook-server-884765884-jmk72\" (UID: \"a8711e7a-0128-4160-9be7-6b25ccdf2ea6\") " pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.784517 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8711e7a-0128-4160-9be7-6b25ccdf2ea6-apiservice-cert\") pod \"metallb-operator-webhook-server-884765884-jmk72\" (UID: \"a8711e7a-0128-4160-9be7-6b25ccdf2ea6\") " pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.784549 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8711e7a-0128-4160-9be7-6b25ccdf2ea6-webhook-cert\") pod \"metallb-operator-webhook-server-884765884-jmk72\" (UID: \"a8711e7a-0128-4160-9be7-6b25ccdf2ea6\") " pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.800985 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s426r\" (UniqueName: \"kubernetes.io/projected/a8711e7a-0128-4160-9be7-6b25ccdf2ea6-kube-api-access-s426r\") pod \"metallb-operator-webhook-server-884765884-jmk72\" (UID: \"a8711e7a-0128-4160-9be7-6b25ccdf2ea6\") " pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.835581 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:50 crc kubenswrapper[4834]: I1126 12:21:50.963832 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-558846f6-bdb45"]
Nov 26 12:21:50 crc kubenswrapper[4834]: W1126 12:21:50.968902 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7038dc0b_b755_4c32_871a_27327898c558.slice/crio-d5c326cefde8e92bd27c9a0f3c5452ba42cea8623e7fa259dd376b9b478114d9 WatchSource:0}: Error finding container d5c326cefde8e92bd27c9a0f3c5452ba42cea8623e7fa259dd376b9b478114d9: Status 404 returned error can't find the container with id d5c326cefde8e92bd27c9a0f3c5452ba42cea8623e7fa259dd376b9b478114d9
Nov 26 12:21:51 crc kubenswrapper[4834]: I1126 12:21:51.199683 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-884765884-jmk72"]
Nov 26 12:21:51 crc kubenswrapper[4834]: W1126 12:21:51.203887 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8711e7a_0128_4160_9be7_6b25ccdf2ea6.slice/crio-41f7d2c679277766d70ca43fc5e379bc17e61950ff2a3c3bfbffed6ab8a7607b WatchSource:0}: Error finding container 41f7d2c679277766d70ca43fc5e379bc17e61950ff2a3c3bfbffed6ab8a7607b: Status 404 returned error can't find the container with id 41f7d2c679277766d70ca43fc5e379bc17e61950ff2a3c3bfbffed6ab8a7607b
Nov 26 12:21:51 crc kubenswrapper[4834]: I1126 12:21:51.774887 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72" event={"ID":"a8711e7a-0128-4160-9be7-6b25ccdf2ea6","Type":"ContainerStarted","Data":"41f7d2c679277766d70ca43fc5e379bc17e61950ff2a3c3bfbffed6ab8a7607b"}
Nov 26 12:21:51 crc kubenswrapper[4834]: I1126 12:21:51.776980 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45" event={"ID":"7038dc0b-b755-4c32-871a-27327898c558","Type":"ContainerStarted","Data":"d5c326cefde8e92bd27c9a0f3c5452ba42cea8623e7fa259dd376b9b478114d9"}
Nov 26 12:21:55 crc kubenswrapper[4834]: I1126 12:21:55.803967 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72" event={"ID":"a8711e7a-0128-4160-9be7-6b25ccdf2ea6","Type":"ContainerStarted","Data":"18284b412dadbf83bf719ed4b943e1490f141fe57e32af560a6196613d4deeb7"}
Nov 26 12:21:55 crc kubenswrapper[4834]: I1126 12:21:55.804534 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:21:55 crc kubenswrapper[4834]: I1126 12:21:55.806527 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45" event={"ID":"7038dc0b-b755-4c32-871a-27327898c558","Type":"ContainerStarted","Data":"a12080665fe034abb33f9cef56323a04f1722691309b61575e4e7828f8817c28"}
Nov 26 12:21:55 crc kubenswrapper[4834]: I1126 12:21:55.806669 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:21:55 crc kubenswrapper[4834]: I1126 12:21:55.821091 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72" podStartSLOduration=1.684900049 podStartE2EDuration="5.821075657s" podCreationTimestamp="2025-11-26 12:21:50 +0000 UTC" firstStartedPulling="2025-11-26 12:21:51.20653278 +0000 UTC m=+609.113746133" lastFinishedPulling="2025-11-26 12:21:55.342708389 +0000 UTC m=+613.249921741" observedRunningTime="2025-11-26 12:21:55.817493997 +0000 UTC m=+613.724707339" watchObservedRunningTime="2025-11-26 12:21:55.821075657 +0000 UTC m=+613.728289008"
Nov 26 12:21:55 crc kubenswrapper[4834]: I1126 12:21:55.836010 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45" podStartSLOduration=1.482859327 podStartE2EDuration="5.835990274s" podCreationTimestamp="2025-11-26 12:21:50 +0000 UTC" firstStartedPulling="2025-11-26 12:21:50.973593938 +0000 UTC m=+608.880807289" lastFinishedPulling="2025-11-26 12:21:55.326724884 +0000 UTC m=+613.233938236" observedRunningTime="2025-11-26 12:21:55.835471055 +0000 UTC m=+613.742684407" watchObservedRunningTime="2025-11-26 12:21:55.835990274 +0000 UTC m=+613.743203627"
Nov 26 12:22:10 crc kubenswrapper[4834]: I1126 12:22:10.841002 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-884765884-jmk72"
Nov 26 12:22:30 crc kubenswrapper[4834]: I1126 12:22:30.540955 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-558846f6-bdb45"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.099047 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9bcr6"]
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.101597 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf"]
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.102005 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9bcr6"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.102190 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.103838 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.104081 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.104094 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.107145 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qc9d6"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.116325 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf"]
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.162538 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gkhb7"]
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.163289 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gkhb7"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.165169 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.165239 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-8f7hx"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.165539 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.181413 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.198911 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-fsv98"]
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.199961 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-fsv98"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.202012 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.206018 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-fsv98"]
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.256547 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-frr-sockets\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.256724 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-metrics-certs\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.256806 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f9e616-aa50-4bbb-adf2-d635f4ed3b0b-cert\") pod \"frr-k8s-webhook-server-6998585d5-jwdtf\" (UID: \"18f9e616-aa50-4bbb-adf2-d635f4ed3b0b\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf"
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.256885 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a96c765a-d06b-44f4-9c21-41abc86dfa7c-cert\") pod \"controller-6c7b4b5f48-fsv98\" (UID: \"a96c765a-d06b-44f4-9c21-41abc86dfa7c\") " pod="metallb-system/controller-6c7b4b5f48-fsv98"
Nov 26 12:22:31 crc
kubenswrapper[4834]: I1126 12:22:31.256958 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a96c765a-d06b-44f4-9c21-41abc86dfa7c-metrics-certs\") pod \"controller-6c7b4b5f48-fsv98\" (UID: \"a96c765a-d06b-44f4-9c21-41abc86dfa7c\") " pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257046 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-frr-conf\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257090 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmsdf\" (UniqueName: \"kubernetes.io/projected/08a409b5-356f-4044-b624-bc95b687b192-kube-api-access-lmsdf\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257115 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cd4f4222-097e-4ec4-840c-8acd707eb05c-metallb-excludel2\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257140 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-metrics\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257217 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8krd\" (UniqueName: \"kubernetes.io/projected/cd4f4222-097e-4ec4-840c-8acd707eb05c-kube-api-access-z8krd\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257269 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-reloader\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257295 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnws\" (UniqueName: \"kubernetes.io/projected/a96c765a-d06b-44f4-9c21-41abc86dfa7c-kube-api-access-cgnws\") pod \"controller-6c7b4b5f48-fsv98\" (UID: \"a96c765a-d06b-44f4-9c21-41abc86dfa7c\") " pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257365 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmr4b\" (UniqueName: \"kubernetes.io/projected/18f9e616-aa50-4bbb-adf2-d635f4ed3b0b-kube-api-access-dmr4b\") pod \"frr-k8s-webhook-server-6998585d5-jwdtf\" (UID: \"18f9e616-aa50-4bbb-adf2-d635f4ed3b0b\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257383 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-memberlist\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 
12:22:31.257404 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08a409b5-356f-4044-b624-bc95b687b192-metrics-certs\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.257488 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/08a409b5-356f-4044-b624-bc95b687b192-frr-startup\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358031 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/08a409b5-356f-4044-b624-bc95b687b192-frr-startup\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358075 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-frr-sockets\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358112 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-metrics-certs\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358131 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/18f9e616-aa50-4bbb-adf2-d635f4ed3b0b-cert\") pod \"frr-k8s-webhook-server-6998585d5-jwdtf\" (UID: \"18f9e616-aa50-4bbb-adf2-d635f4ed3b0b\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358155 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a96c765a-d06b-44f4-9c21-41abc86dfa7c-cert\") pod \"controller-6c7b4b5f48-fsv98\" (UID: \"a96c765a-d06b-44f4-9c21-41abc86dfa7c\") " pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358188 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a96c765a-d06b-44f4-9c21-41abc86dfa7c-metrics-certs\") pod \"controller-6c7b4b5f48-fsv98\" (UID: \"a96c765a-d06b-44f4-9c21-41abc86dfa7c\") " pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358211 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-frr-conf\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358237 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmsdf\" (UniqueName: \"kubernetes.io/projected/08a409b5-356f-4044-b624-bc95b687b192-kube-api-access-lmsdf\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358268 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cd4f4222-097e-4ec4-840c-8acd707eb05c-metallb-excludel2\") pod \"speaker-gkhb7\" 
(UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358290 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-metrics\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358341 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8krd\" (UniqueName: \"kubernetes.io/projected/cd4f4222-097e-4ec4-840c-8acd707eb05c-kube-api-access-z8krd\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358357 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-reloader\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358373 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnws\" (UniqueName: \"kubernetes.io/projected/a96c765a-d06b-44f4-9c21-41abc86dfa7c-kube-api-access-cgnws\") pod \"controller-6c7b4b5f48-fsv98\" (UID: \"a96c765a-d06b-44f4-9c21-41abc86dfa7c\") " pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358418 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmr4b\" (UniqueName: \"kubernetes.io/projected/18f9e616-aa50-4bbb-adf2-d635f4ed3b0b-kube-api-access-dmr4b\") pod \"frr-k8s-webhook-server-6998585d5-jwdtf\" (UID: \"18f9e616-aa50-4bbb-adf2-d635f4ed3b0b\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" 
Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358434 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-memberlist\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358449 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08a409b5-356f-4044-b624-bc95b687b192-metrics-certs\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.358727 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-frr-conf\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.359097 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-metrics\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: E1126 12:22:31.359158 4834 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.359272 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-reloader\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: E1126 12:22:31.359388 4834 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-memberlist podName:cd4f4222-097e-4ec4-840c-8acd707eb05c nodeName:}" failed. No retries permitted until 2025-11-26 12:22:31.859294736 +0000 UTC m=+649.766508088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-memberlist") pod "speaker-gkhb7" (UID: "cd4f4222-097e-4ec4-840c-8acd707eb05c") : secret "metallb-memberlist" not found Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.359652 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/08a409b5-356f-4044-b624-bc95b687b192-frr-sockets\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.360884 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/08a409b5-356f-4044-b624-bc95b687b192-frr-startup\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.364554 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cd4f4222-097e-4ec4-840c-8acd707eb05c-metallb-excludel2\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.365245 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-metrics-certs\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc 
kubenswrapper[4834]: I1126 12:22:31.366725 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f9e616-aa50-4bbb-adf2-d635f4ed3b0b-cert\") pod \"frr-k8s-webhook-server-6998585d5-jwdtf\" (UID: \"18f9e616-aa50-4bbb-adf2-d635f4ed3b0b\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.368977 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a96c765a-d06b-44f4-9c21-41abc86dfa7c-cert\") pod \"controller-6c7b4b5f48-fsv98\" (UID: \"a96c765a-d06b-44f4-9c21-41abc86dfa7c\") " pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.369659 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a96c765a-d06b-44f4-9c21-41abc86dfa7c-metrics-certs\") pod \"controller-6c7b4b5f48-fsv98\" (UID: \"a96c765a-d06b-44f4-9c21-41abc86dfa7c\") " pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.370385 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08a409b5-356f-4044-b624-bc95b687b192-metrics-certs\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.373904 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnws\" (UniqueName: \"kubernetes.io/projected/a96c765a-d06b-44f4-9c21-41abc86dfa7c-kube-api-access-cgnws\") pod \"controller-6c7b4b5f48-fsv98\" (UID: \"a96c765a-d06b-44f4-9c21-41abc86dfa7c\") " pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.374737 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z8krd\" (UniqueName: \"kubernetes.io/projected/cd4f4222-097e-4ec4-840c-8acd707eb05c-kube-api-access-z8krd\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.375448 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmr4b\" (UniqueName: \"kubernetes.io/projected/18f9e616-aa50-4bbb-adf2-d635f4ed3b0b-kube-api-access-dmr4b\") pod \"frr-k8s-webhook-server-6998585d5-jwdtf\" (UID: \"18f9e616-aa50-4bbb-adf2-d635f4ed3b0b\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.376422 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmsdf\" (UniqueName: \"kubernetes.io/projected/08a409b5-356f-4044-b624-bc95b687b192-kube-api-access-lmsdf\") pod \"frr-k8s-9bcr6\" (UID: \"08a409b5-356f-4044-b624-bc95b687b192\") " pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.418936 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.426039 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.511943 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.820056 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf"] Nov 26 12:22:31 crc kubenswrapper[4834]: W1126 12:22:31.824928 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18f9e616_aa50_4bbb_adf2_d635f4ed3b0b.slice/crio-2b96f1ca472a0952e683bb3a96d94d117a9de63bcfb0b99fc829184ea45e8bb8 WatchSource:0}: Error finding container 2b96f1ca472a0952e683bb3a96d94d117a9de63bcfb0b99fc829184ea45e8bb8: Status 404 returned error can't find the container with id 2b96f1ca472a0952e683bb3a96d94d117a9de63bcfb0b99fc829184ea45e8bb8 Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.865505 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-memberlist\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:31 crc kubenswrapper[4834]: E1126 12:22:31.865891 4834 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 12:22:31 crc kubenswrapper[4834]: E1126 12:22:31.866019 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-memberlist podName:cd4f4222-097e-4ec4-840c-8acd707eb05c nodeName:}" failed. No retries permitted until 2025-11-26 12:22:32.865987402 +0000 UTC m=+650.773200754 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-memberlist") pod "speaker-gkhb7" (UID: "cd4f4222-097e-4ec4-840c-8acd707eb05c") : secret "metallb-memberlist" not found Nov 26 12:22:31 crc kubenswrapper[4834]: I1126 12:22:31.893946 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-fsv98"] Nov 26 12:22:31 crc kubenswrapper[4834]: W1126 12:22:31.897226 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96c765a_d06b_44f4_9c21_41abc86dfa7c.slice/crio-6c134ffab32aa3532d5b1ca0b5c20f76b8498ec075a8f53c723540caa4bffe42 WatchSource:0}: Error finding container 6c134ffab32aa3532d5b1ca0b5c20f76b8498ec075a8f53c723540caa4bffe42: Status 404 returned error can't find the container with id 6c134ffab32aa3532d5b1ca0b5c20f76b8498ec075a8f53c723540caa4bffe42 Nov 26 12:22:32 crc kubenswrapper[4834]: I1126 12:22:32.003923 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" event={"ID":"18f9e616-aa50-4bbb-adf2-d635f4ed3b0b","Type":"ContainerStarted","Data":"2b96f1ca472a0952e683bb3a96d94d117a9de63bcfb0b99fc829184ea45e8bb8"} Nov 26 12:22:32 crc kubenswrapper[4834]: I1126 12:22:32.004822 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-fsv98" event={"ID":"a96c765a-d06b-44f4-9c21-41abc86dfa7c","Type":"ContainerStarted","Data":"6c134ffab32aa3532d5b1ca0b5c20f76b8498ec075a8f53c723540caa4bffe42"} Nov 26 12:22:32 crc kubenswrapper[4834]: I1126 12:22:32.005631 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerStarted","Data":"994eaac36babb50bc634c794a3f21a09285ec16556670d385e31912a4801c7c1"} Nov 26 12:22:32 crc kubenswrapper[4834]: I1126 12:22:32.879863 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-memberlist\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:32 crc kubenswrapper[4834]: I1126 12:22:32.888841 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd4f4222-097e-4ec4-840c-8acd707eb05c-memberlist\") pod \"speaker-gkhb7\" (UID: \"cd4f4222-097e-4ec4-840c-8acd707eb05c\") " pod="metallb-system/speaker-gkhb7" Nov 26 12:22:32 crc kubenswrapper[4834]: I1126 12:22:32.997792 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gkhb7" Nov 26 12:22:33 crc kubenswrapper[4834]: I1126 12:22:33.016375 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-fsv98" event={"ID":"a96c765a-d06b-44f4-9c21-41abc86dfa7c","Type":"ContainerStarted","Data":"e39d1f369fa79d6fee3b7d390294d443ddbd78cd41c718031b1aa6c290f4f2dd"} Nov 26 12:22:33 crc kubenswrapper[4834]: I1126 12:22:33.016439 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-fsv98" event={"ID":"a96c765a-d06b-44f4-9c21-41abc86dfa7c","Type":"ContainerStarted","Data":"90b7b8ee0960f193df750e54af38aef00c608e52dd056a7fdeebf7702adb6e57"} Nov 26 12:22:33 crc kubenswrapper[4834]: I1126 12:22:33.016480 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:33 crc kubenswrapper[4834]: I1126 12:22:33.038323 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-fsv98" podStartSLOduration=2.038289696 podStartE2EDuration="2.038289696s" podCreationTimestamp="2025-11-26 12:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:22:33.036070988 +0000 UTC m=+650.943284340" watchObservedRunningTime="2025-11-26 12:22:33.038289696 +0000 UTC m=+650.945503047" Nov 26 12:22:33 crc kubenswrapper[4834]: W1126 12:22:33.041184 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd4f4222_097e_4ec4_840c_8acd707eb05c.slice/crio-e12a02aaaad1fe6aea17d0a3231577e3cfafe520e013ceba34f17b1b2ab264ae WatchSource:0}: Error finding container e12a02aaaad1fe6aea17d0a3231577e3cfafe520e013ceba34f17b1b2ab264ae: Status 404 returned error can't find the container with id e12a02aaaad1fe6aea17d0a3231577e3cfafe520e013ceba34f17b1b2ab264ae Nov 26 12:22:34 crc kubenswrapper[4834]: I1126 12:22:34.023985 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gkhb7" event={"ID":"cd4f4222-097e-4ec4-840c-8acd707eb05c","Type":"ContainerStarted","Data":"f75959084e5cc41cf34c88b8aae669810ba160b853bf3d02b047e9d5a1e84e9f"} Nov 26 12:22:34 crc kubenswrapper[4834]: I1126 12:22:34.024023 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gkhb7" event={"ID":"cd4f4222-097e-4ec4-840c-8acd707eb05c","Type":"ContainerStarted","Data":"b6522fa14c59b171368074d01880f685b08b727b56c491eb291fc97ed3e805d9"} Nov 26 12:22:34 crc kubenswrapper[4834]: I1126 12:22:34.024032 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gkhb7" event={"ID":"cd4f4222-097e-4ec4-840c-8acd707eb05c","Type":"ContainerStarted","Data":"e12a02aaaad1fe6aea17d0a3231577e3cfafe520e013ceba34f17b1b2ab264ae"} Nov 26 12:22:34 crc kubenswrapper[4834]: I1126 12:22:34.024508 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gkhb7" Nov 26 12:22:34 crc kubenswrapper[4834]: I1126 12:22:34.048871 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-gkhb7" podStartSLOduration=3.048860344 podStartE2EDuration="3.048860344s" podCreationTimestamp="2025-11-26 12:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:22:34.046528673 +0000 UTC m=+651.953742025" watchObservedRunningTime="2025-11-26 12:22:34.048860344 +0000 UTC m=+651.956073686" Nov 26 12:22:38 crc kubenswrapper[4834]: I1126 12:22:38.053112 4834 generic.go:334] "Generic (PLEG): container finished" podID="08a409b5-356f-4044-b624-bc95b687b192" containerID="e2af21f159e1d868f792232886e3c528be23e465e8c657ee5de3704d924f4f7b" exitCode=0 Nov 26 12:22:38 crc kubenswrapper[4834]: I1126 12:22:38.053195 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerDied","Data":"e2af21f159e1d868f792232886e3c528be23e465e8c657ee5de3704d924f4f7b"} Nov 26 12:22:38 crc kubenswrapper[4834]: I1126 12:22:38.055938 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" event={"ID":"18f9e616-aa50-4bbb-adf2-d635f4ed3b0b","Type":"ContainerStarted","Data":"02afac8301d1af34b52e4399814f6411e0b41ec71e33c6d7a28df8ea2ccc63ca"} Nov 26 12:22:38 crc kubenswrapper[4834]: I1126 12:22:38.056130 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" Nov 26 12:22:38 crc kubenswrapper[4834]: I1126 12:22:38.088995 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" podStartSLOduration=1.164140218 podStartE2EDuration="7.088965497s" podCreationTimestamp="2025-11-26 12:22:31 +0000 UTC" firstStartedPulling="2025-11-26 12:22:31.827087156 +0000 UTC m=+649.734300508" lastFinishedPulling="2025-11-26 12:22:37.751912434 +0000 UTC m=+655.659125787" 
observedRunningTime="2025-11-26 12:22:38.084038759 +0000 UTC m=+655.991252111" watchObservedRunningTime="2025-11-26 12:22:38.088965497 +0000 UTC m=+655.996178849" Nov 26 12:22:39 crc kubenswrapper[4834]: I1126 12:22:39.063450 4834 generic.go:334] "Generic (PLEG): container finished" podID="08a409b5-356f-4044-b624-bc95b687b192" containerID="e546a89fcb78452304221f7020ae9ed377171efbb03fbb16a37682277f9be3be" exitCode=0 Nov 26 12:22:39 crc kubenswrapper[4834]: I1126 12:22:39.063559 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerDied","Data":"e546a89fcb78452304221f7020ae9ed377171efbb03fbb16a37682277f9be3be"} Nov 26 12:22:40 crc kubenswrapper[4834]: I1126 12:22:40.072885 4834 generic.go:334] "Generic (PLEG): container finished" podID="08a409b5-356f-4044-b624-bc95b687b192" containerID="5ac052a4c0c71c8d7ebefbd2252c487712e8e115b7325231e208c6edd60b8a3d" exitCode=0 Nov 26 12:22:40 crc kubenswrapper[4834]: I1126 12:22:40.072924 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerDied","Data":"5ac052a4c0c71c8d7ebefbd2252c487712e8e115b7325231e208c6edd60b8a3d"} Nov 26 12:22:41 crc kubenswrapper[4834]: I1126 12:22:41.085517 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerStarted","Data":"76f79f54bacac711ac89ae132609ba0c8dee614843c99269b322990ef5443e8c"} Nov 26 12:22:41 crc kubenswrapper[4834]: I1126 12:22:41.086002 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:41 crc kubenswrapper[4834]: I1126 12:22:41.086019 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" 
event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerStarted","Data":"e663e252b5e2cc298e930e361f9c9fc89311c16b794b9a5dd9dab8c0652b45cb"} Nov 26 12:22:41 crc kubenswrapper[4834]: I1126 12:22:41.086034 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerStarted","Data":"9b84c6c9ec6d174c949cfe2c961dee8d514d1a1c725b4af920bc34ec04108077"} Nov 26 12:22:41 crc kubenswrapper[4834]: I1126 12:22:41.086043 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerStarted","Data":"ee1da74352a260dd3dc49f014c393193c265dd8b66355d22155c059fde68c159"} Nov 26 12:22:41 crc kubenswrapper[4834]: I1126 12:22:41.086055 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerStarted","Data":"c36f72b737c9d4d656b60a0458cbe83d06c1e38ed88a3a045dd616f0d99c4ef1"} Nov 26 12:22:41 crc kubenswrapper[4834]: I1126 12:22:41.086063 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9bcr6" event={"ID":"08a409b5-356f-4044-b624-bc95b687b192","Type":"ContainerStarted","Data":"c4cf24d44e99694da38053167a9503b173cc01bda908873393a7fc7a8e00f5b6"} Nov 26 12:22:41 crc kubenswrapper[4834]: I1126 12:22:41.110103 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9bcr6" podStartSLOduration=3.929918646 podStartE2EDuration="10.11007447s" podCreationTimestamp="2025-11-26 12:22:31 +0000 UTC" firstStartedPulling="2025-11-26 12:22:31.55507445 +0000 UTC m=+649.462287802" lastFinishedPulling="2025-11-26 12:22:37.735230274 +0000 UTC m=+655.642443626" observedRunningTime="2025-11-26 12:22:41.10604292 +0000 UTC m=+659.013256272" watchObservedRunningTime="2025-11-26 12:22:41.11007447 +0000 UTC m=+659.017287822" Nov 26 12:22:41 
crc kubenswrapper[4834]: I1126 12:22:41.419879 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:41 crc kubenswrapper[4834]: I1126 12:22:41.453592 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:51 crc kubenswrapper[4834]: I1126 12:22:51.421701 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9bcr6" Nov 26 12:22:51 crc kubenswrapper[4834]: I1126 12:22:51.429934 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-jwdtf" Nov 26 12:22:51 crc kubenswrapper[4834]: I1126 12:22:51.515191 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-fsv98" Nov 26 12:22:53 crc kubenswrapper[4834]: I1126 12:22:53.002014 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gkhb7" Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.153362 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vm749"] Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.153962 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vm749" Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.155396 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.155576 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.155724 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-ps48f" Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.166985 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vm749"] Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.296096 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfz9d\" (UniqueName: \"kubernetes.io/projected/975c0d1c-2e73-4dfd-89c1-1c225d0268a0-kube-api-access-cfz9d\") pod \"openstack-operator-index-vm749\" (UID: \"975c0d1c-2e73-4dfd-89c1-1c225d0268a0\") " pod="openstack-operators/openstack-operator-index-vm749" Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.397096 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfz9d\" (UniqueName: \"kubernetes.io/projected/975c0d1c-2e73-4dfd-89c1-1c225d0268a0-kube-api-access-cfz9d\") pod \"openstack-operator-index-vm749\" (UID: \"975c0d1c-2e73-4dfd-89c1-1c225d0268a0\") " pod="openstack-operators/openstack-operator-index-vm749" Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.416200 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfz9d\" (UniqueName: \"kubernetes.io/projected/975c0d1c-2e73-4dfd-89c1-1c225d0268a0-kube-api-access-cfz9d\") pod \"openstack-operator-index-vm749\" (UID: 
\"975c0d1c-2e73-4dfd-89c1-1c225d0268a0\") " pod="openstack-operators/openstack-operator-index-vm749" Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.470904 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vm749" Nov 26 12:22:55 crc kubenswrapper[4834]: I1126 12:22:55.839453 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vm749"] Nov 26 12:22:55 crc kubenswrapper[4834]: W1126 12:22:55.846232 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod975c0d1c_2e73_4dfd_89c1_1c225d0268a0.slice/crio-74e7b76dbd81d828f34c1db951e7a40ac66db316417d9040b7f8ab9c12354b45 WatchSource:0}: Error finding container 74e7b76dbd81d828f34c1db951e7a40ac66db316417d9040b7f8ab9c12354b45: Status 404 returned error can't find the container with id 74e7b76dbd81d828f34c1db951e7a40ac66db316417d9040b7f8ab9c12354b45 Nov 26 12:22:56 crc kubenswrapper[4834]: I1126 12:22:56.170178 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vm749" event={"ID":"975c0d1c-2e73-4dfd-89c1-1c225d0268a0","Type":"ContainerStarted","Data":"74e7b76dbd81d828f34c1db951e7a40ac66db316417d9040b7f8ab9c12354b45"} Nov 26 12:22:58 crc kubenswrapper[4834]: I1126 12:22:58.183848 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vm749" event={"ID":"975c0d1c-2e73-4dfd-89c1-1c225d0268a0","Type":"ContainerStarted","Data":"5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22"} Nov 26 12:22:58 crc kubenswrapper[4834]: I1126 12:22:58.207407 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vm749" podStartSLOduration=1.568192867 podStartE2EDuration="3.207381411s" podCreationTimestamp="2025-11-26 12:22:55 +0000 UTC" 
firstStartedPulling="2025-11-26 12:22:55.848401794 +0000 UTC m=+673.755615135" lastFinishedPulling="2025-11-26 12:22:57.487590317 +0000 UTC m=+675.394803679" observedRunningTime="2025-11-26 12:22:58.204798672 +0000 UTC m=+676.112012024" watchObservedRunningTime="2025-11-26 12:22:58.207381411 +0000 UTC m=+676.114594764" Nov 26 12:22:58 crc kubenswrapper[4834]: I1126 12:22:58.341999 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vm749"] Nov 26 12:22:58 crc kubenswrapper[4834]: I1126 12:22:58.943242 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-x8kqp"] Nov 26 12:22:58 crc kubenswrapper[4834]: I1126 12:22:58.944120 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x8kqp" Nov 26 12:22:58 crc kubenswrapper[4834]: I1126 12:22:58.951241 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x8kqp"] Nov 26 12:22:59 crc kubenswrapper[4834]: I1126 12:22:59.045491 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgb8n\" (UniqueName: \"kubernetes.io/projected/bf2e56f9-16d2-4807-9650-87625dcfacd2-kube-api-access-wgb8n\") pod \"openstack-operator-index-x8kqp\" (UID: \"bf2e56f9-16d2-4807-9650-87625dcfacd2\") " pod="openstack-operators/openstack-operator-index-x8kqp" Nov 26 12:22:59 crc kubenswrapper[4834]: I1126 12:22:59.147684 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgb8n\" (UniqueName: \"kubernetes.io/projected/bf2e56f9-16d2-4807-9650-87625dcfacd2-kube-api-access-wgb8n\") pod \"openstack-operator-index-x8kqp\" (UID: \"bf2e56f9-16d2-4807-9650-87625dcfacd2\") " pod="openstack-operators/openstack-operator-index-x8kqp" Nov 26 12:22:59 crc kubenswrapper[4834]: I1126 12:22:59.164770 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wgb8n\" (UniqueName: \"kubernetes.io/projected/bf2e56f9-16d2-4807-9650-87625dcfacd2-kube-api-access-wgb8n\") pod \"openstack-operator-index-x8kqp\" (UID: \"bf2e56f9-16d2-4807-9650-87625dcfacd2\") " pod="openstack-operators/openstack-operator-index-x8kqp" Nov 26 12:22:59 crc kubenswrapper[4834]: I1126 12:22:59.258716 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x8kqp" Nov 26 12:22:59 crc kubenswrapper[4834]: I1126 12:22:59.621719 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x8kqp"] Nov 26 12:22:59 crc kubenswrapper[4834]: W1126 12:22:59.625901 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2e56f9_16d2_4807_9650_87625dcfacd2.slice/crio-dad7b894a7d79035b324535addcd8fc80922899e8c17708d5acbbd01e1508f9b WatchSource:0}: Error finding container dad7b894a7d79035b324535addcd8fc80922899e8c17708d5acbbd01e1508f9b: Status 404 returned error can't find the container with id dad7b894a7d79035b324535addcd8fc80922899e8c17708d5acbbd01e1508f9b Nov 26 12:23:00 crc kubenswrapper[4834]: I1126 12:23:00.200256 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x8kqp" event={"ID":"bf2e56f9-16d2-4807-9650-87625dcfacd2","Type":"ContainerStarted","Data":"dad7b894a7d79035b324535addcd8fc80922899e8c17708d5acbbd01e1508f9b"} Nov 26 12:23:00 crc kubenswrapper[4834]: I1126 12:23:00.200425 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-vm749" podUID="975c0d1c-2e73-4dfd-89c1-1c225d0268a0" containerName="registry-server" containerID="cri-o://5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22" gracePeriod=2 Nov 26 12:23:00 crc kubenswrapper[4834]: I1126 12:23:00.496087 4834 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vm749" Nov 26 12:23:00 crc kubenswrapper[4834]: I1126 12:23:00.665176 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfz9d\" (UniqueName: \"kubernetes.io/projected/975c0d1c-2e73-4dfd-89c1-1c225d0268a0-kube-api-access-cfz9d\") pod \"975c0d1c-2e73-4dfd-89c1-1c225d0268a0\" (UID: \"975c0d1c-2e73-4dfd-89c1-1c225d0268a0\") " Nov 26 12:23:00 crc kubenswrapper[4834]: I1126 12:23:00.686846 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975c0d1c-2e73-4dfd-89c1-1c225d0268a0-kube-api-access-cfz9d" (OuterVolumeSpecName: "kube-api-access-cfz9d") pod "975c0d1c-2e73-4dfd-89c1-1c225d0268a0" (UID: "975c0d1c-2e73-4dfd-89c1-1c225d0268a0"). InnerVolumeSpecName "kube-api-access-cfz9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:23:00 crc kubenswrapper[4834]: I1126 12:23:00.767154 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfz9d\" (UniqueName: \"kubernetes.io/projected/975c0d1c-2e73-4dfd-89c1-1c225d0268a0-kube-api-access-cfz9d\") on node \"crc\" DevicePath \"\"" Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.210385 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x8kqp" event={"ID":"bf2e56f9-16d2-4807-9650-87625dcfacd2","Type":"ContainerStarted","Data":"f2154c84547e264133393613008884afff441c4af038f63e7cb2dcf53bf2cc10"} Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.212122 4834 generic.go:334] "Generic (PLEG): container finished" podID="975c0d1c-2e73-4dfd-89c1-1c225d0268a0" containerID="5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22" exitCode=0 Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.212195 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-vm749" event={"ID":"975c0d1c-2e73-4dfd-89c1-1c225d0268a0","Type":"ContainerDied","Data":"5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22"} Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.212250 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vm749" event={"ID":"975c0d1c-2e73-4dfd-89c1-1c225d0268a0","Type":"ContainerDied","Data":"74e7b76dbd81d828f34c1db951e7a40ac66db316417d9040b7f8ab9c12354b45"} Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.212368 4834 scope.go:117] "RemoveContainer" containerID="5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22" Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.212541 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vm749" Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.225744 4834 scope.go:117] "RemoveContainer" containerID="5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22" Nov 26 12:23:01 crc kubenswrapper[4834]: E1126 12:23:01.226123 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22\": container with ID starting with 5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22 not found: ID does not exist" containerID="5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22" Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.226166 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22"} err="failed to get container status \"5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22\": rpc error: code = NotFound desc = could not find container 
\"5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22\": container with ID starting with 5aa3b04e4fac28dc0cbbcc4e55103bcea0b00cf8e14065a44134dc702aa6ba22 not found: ID does not exist" Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.236525 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-x8kqp" podStartSLOduration=2.748284188 podStartE2EDuration="3.236510415s" podCreationTimestamp="2025-11-26 12:22:58 +0000 UTC" firstStartedPulling="2025-11-26 12:22:59.629165716 +0000 UTC m=+677.536379068" lastFinishedPulling="2025-11-26 12:23:00.117391943 +0000 UTC m=+678.024605295" observedRunningTime="2025-11-26 12:23:01.223966055 +0000 UTC m=+679.131179407" watchObservedRunningTime="2025-11-26 12:23:01.236510415 +0000 UTC m=+679.143723766" Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.238412 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vm749"] Nov 26 12:23:01 crc kubenswrapper[4834]: I1126 12:23:01.241361 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-vm749"] Nov 26 12:23:02 crc kubenswrapper[4834]: I1126 12:23:02.425049 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975c0d1c-2e73-4dfd-89c1-1c225d0268a0" path="/var/lib/kubelet/pods/975c0d1c-2e73-4dfd-89c1-1c225d0268a0/volumes" Nov 26 12:23:09 crc kubenswrapper[4834]: I1126 12:23:09.258892 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-x8kqp" Nov 26 12:23:09 crc kubenswrapper[4834]: I1126 12:23:09.259402 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-x8kqp" Nov 26 12:23:09 crc kubenswrapper[4834]: I1126 12:23:09.282789 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-x8kqp" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.297502 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-x8kqp" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.783402 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf"] Nov 26 12:23:10 crc kubenswrapper[4834]: E1126 12:23:10.783652 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975c0d1c-2e73-4dfd-89c1-1c225d0268a0" containerName="registry-server" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.783670 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="975c0d1c-2e73-4dfd-89c1-1c225d0268a0" containerName="registry-server" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.783776 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="975c0d1c-2e73-4dfd-89c1-1c225d0268a0" containerName="registry-server" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.784483 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.786083 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nnzmt" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.791741 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf"] Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.810834 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-util\") pod \"3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.810867 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fgzh\" (UniqueName: \"kubernetes.io/projected/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-kube-api-access-9fgzh\") pod \"3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.810889 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-bundle\") pod \"3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 
12:23:10.912672 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-util\") pod \"3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.912736 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fgzh\" (UniqueName: \"kubernetes.io/projected/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-kube-api-access-9fgzh\") pod \"3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.912779 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-bundle\") pod \"3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.913507 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-bundle\") pod \"3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.913593 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-util\") pod \"3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:10 crc kubenswrapper[4834]: I1126 12:23:10.932626 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fgzh\" (UniqueName: \"kubernetes.io/projected/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-kube-api-access-9fgzh\") pod \"3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:11 crc kubenswrapper[4834]: I1126 12:23:11.099218 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:11 crc kubenswrapper[4834]: I1126 12:23:11.258502 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf"] Nov 26 12:23:11 crc kubenswrapper[4834]: I1126 12:23:11.283770 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" event={"ID":"7fa1d84c-4b0c-45bf-881d-01d0a25ea746","Type":"ContainerStarted","Data":"0433192f4eaff477b9682617d4cbc85b39cbcfe167487d816f5d8e8b4ce7305f"} Nov 26 12:23:12 crc kubenswrapper[4834]: I1126 12:23:12.296274 4834 generic.go:334] "Generic (PLEG): container finished" podID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" containerID="20ce032f7068ea71ad2bb3aadbb0c0fd0231965d1a36769812f8a3acad0054f1" exitCode=0 Nov 26 12:23:12 crc kubenswrapper[4834]: I1126 12:23:12.296348 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" event={"ID":"7fa1d84c-4b0c-45bf-881d-01d0a25ea746","Type":"ContainerDied","Data":"20ce032f7068ea71ad2bb3aadbb0c0fd0231965d1a36769812f8a3acad0054f1"} Nov 26 12:23:14 crc kubenswrapper[4834]: I1126 12:23:14.310604 4834 generic.go:334] "Generic (PLEG): container finished" podID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" containerID="ef84a577ca0213d5eea74e4d455b97c8949abeeee4525f3f2607656d70182c7b" exitCode=0 Nov 26 12:23:14 crc kubenswrapper[4834]: I1126 12:23:14.311192 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" event={"ID":"7fa1d84c-4b0c-45bf-881d-01d0a25ea746","Type":"ContainerDied","Data":"ef84a577ca0213d5eea74e4d455b97c8949abeeee4525f3f2607656d70182c7b"} Nov 26 12:23:15 crc kubenswrapper[4834]: I1126 12:23:15.322812 4834 generic.go:334] "Generic (PLEG): container finished" podID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" containerID="df7a7ca8b900d985bed1e5da46e9e5f19fb15d1b2a2887168004429170a82e2d" exitCode=0 Nov 26 12:23:15 crc kubenswrapper[4834]: I1126 12:23:15.322856 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" event={"ID":"7fa1d84c-4b0c-45bf-881d-01d0a25ea746","Type":"ContainerDied","Data":"df7a7ca8b900d985bed1e5da46e9e5f19fb15d1b2a2887168004429170a82e2d"} Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.522027 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.687182 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fgzh\" (UniqueName: \"kubernetes.io/projected/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-kube-api-access-9fgzh\") pod \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.687830 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-bundle\") pod \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.688000 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-util\") pod \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\" (UID: \"7fa1d84c-4b0c-45bf-881d-01d0a25ea746\") " Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.689090 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-bundle" (OuterVolumeSpecName: "bundle") pod "7fa1d84c-4b0c-45bf-881d-01d0a25ea746" (UID: "7fa1d84c-4b0c-45bf-881d-01d0a25ea746"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.692214 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-kube-api-access-9fgzh" (OuterVolumeSpecName: "kube-api-access-9fgzh") pod "7fa1d84c-4b0c-45bf-881d-01d0a25ea746" (UID: "7fa1d84c-4b0c-45bf-881d-01d0a25ea746"). InnerVolumeSpecName "kube-api-access-9fgzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.698370 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-util" (OuterVolumeSpecName: "util") pod "7fa1d84c-4b0c-45bf-881d-01d0a25ea746" (UID: "7fa1d84c-4b0c-45bf-881d-01d0a25ea746"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.790068 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-util\") on node \"crc\" DevicePath \"\"" Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.790100 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fgzh\" (UniqueName: \"kubernetes.io/projected/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-kube-api-access-9fgzh\") on node \"crc\" DevicePath \"\"" Nov 26 12:23:16 crc kubenswrapper[4834]: I1126 12:23:16.790116 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fa1d84c-4b0c-45bf-881d-01d0a25ea746-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:23:17 crc kubenswrapper[4834]: I1126 12:23:17.333909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" event={"ID":"7fa1d84c-4b0c-45bf-881d-01d0a25ea746","Type":"ContainerDied","Data":"0433192f4eaff477b9682617d4cbc85b39cbcfe167487d816f5d8e8b4ce7305f"} Nov 26 12:23:17 crc kubenswrapper[4834]: I1126 12:23:17.334169 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0433192f4eaff477b9682617d4cbc85b39cbcfe167487d816f5d8e8b4ce7305f" Nov 26 12:23:17 crc kubenswrapper[4834]: I1126 12:23:17.334012 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf" Nov 26 12:23:21 crc kubenswrapper[4834]: I1126 12:23:21.531400 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:23:21 crc kubenswrapper[4834]: I1126 12:23:21.531884 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.197394 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2"] Nov 26 12:23:23 crc kubenswrapper[4834]: E1126 12:23:23.197596 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" containerName="pull" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.197607 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" containerName="pull" Nov 26 12:23:23 crc kubenswrapper[4834]: E1126 12:23:23.197615 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" containerName="extract" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.197621 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" containerName="extract" Nov 26 12:23:23 crc kubenswrapper[4834]: E1126 12:23:23.197631 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" 
containerName="util" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.197637 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" containerName="util" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.197732 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa1d84c-4b0c-45bf-881d-01d0a25ea746" containerName="extract" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.198051 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.200367 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-htnv9" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.217462 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2"] Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.268577 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmt76\" (UniqueName: \"kubernetes.io/projected/28ea2d78-52e8-4081-9324-3b3c7acb0c34-kube-api-access-rmt76\") pod \"openstack-operator-controller-operator-544fb75865-94vd2\" (UID: \"28ea2d78-52e8-4081-9324-3b3c7acb0c34\") " pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.373267 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmt76\" (UniqueName: \"kubernetes.io/projected/28ea2d78-52e8-4081-9324-3b3c7acb0c34-kube-api-access-rmt76\") pod \"openstack-operator-controller-operator-544fb75865-94vd2\" (UID: \"28ea2d78-52e8-4081-9324-3b3c7acb0c34\") " pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" Nov 26 12:23:23 
crc kubenswrapper[4834]: I1126 12:23:23.390022 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmt76\" (UniqueName: \"kubernetes.io/projected/28ea2d78-52e8-4081-9324-3b3c7acb0c34-kube-api-access-rmt76\") pod \"openstack-operator-controller-operator-544fb75865-94vd2\" (UID: \"28ea2d78-52e8-4081-9324-3b3c7acb0c34\") " pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.511349 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" Nov 26 12:23:23 crc kubenswrapper[4834]: I1126 12:23:23.922882 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2"] Nov 26 12:23:24 crc kubenswrapper[4834]: I1126 12:23:24.385910 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" event={"ID":"28ea2d78-52e8-4081-9324-3b3c7acb0c34","Type":"ContainerStarted","Data":"152bd366bfd49654fc0220cb329a454819cdb7f7c3fcc8426bc1a8207a949d3d"} Nov 26 12:23:28 crc kubenswrapper[4834]: I1126 12:23:28.411880 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" event={"ID":"28ea2d78-52e8-4081-9324-3b3c7acb0c34","Type":"ContainerStarted","Data":"c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5"} Nov 26 12:23:28 crc kubenswrapper[4834]: I1126 12:23:28.449818 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" podStartSLOduration=2.048157452 podStartE2EDuration="5.449783279s" podCreationTimestamp="2025-11-26 12:23:23 +0000 UTC" firstStartedPulling="2025-11-26 12:23:23.931942114 +0000 UTC m=+701.839155456" 
lastFinishedPulling="2025-11-26 12:23:27.333567931 +0000 UTC m=+705.240781283" observedRunningTime="2025-11-26 12:23:28.436735551 +0000 UTC m=+706.343948903" watchObservedRunningTime="2025-11-26 12:23:28.449783279 +0000 UTC m=+706.356996632" Nov 26 12:23:29 crc kubenswrapper[4834]: I1126 12:23:29.417762 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" Nov 26 12:23:33 crc kubenswrapper[4834]: I1126 12:23:33.513957 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.697916 4834 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.821486 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.822448 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.823618 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-fmqh4" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.824951 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.825864 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.827398 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-jts8j" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.827916 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.834988 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpx7b\" (UniqueName: \"kubernetes.io/projected/90b33e2e-c7dc-4e9e-b479-dca5251277bc-kube-api-access-zpx7b\") pod \"barbican-operator-controller-manager-7b64f4fb85-c5rjq\" (UID: \"90b33e2e-c7dc-4e9e-b479-dca5251277bc\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.835060 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzh5\" (UniqueName: \"kubernetes.io/projected/d85e5da5-d129-4904-8bde-6ff4bb92614f-kube-api-access-6pzh5\") pod \"cinder-operator-controller-manager-6b7f75547b-m4bdb\" (UID: \"d85e5da5-d129-4904-8bde-6ff4bb92614f\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.835338 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.860122 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.861058 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.862980 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-7vd97"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.863799 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.864622 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-94wcp" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.873277 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-fkrx8" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.877498 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.878242 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.880250 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-m4fk7" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.883020 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.885661 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-7vd97"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.888120 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.888883 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.894856 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lqs4k" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.900302 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.902677 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-btq96"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.903516 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.906646 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dq8th" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.906795 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.911349 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.927136 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-btq96"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.937172 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpx7b\" (UniqueName: \"kubernetes.io/projected/90b33e2e-c7dc-4e9e-b479-dca5251277bc-kube-api-access-zpx7b\") pod \"barbican-operator-controller-manager-7b64f4fb85-c5rjq\" (UID: \"90b33e2e-c7dc-4e9e-b479-dca5251277bc\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.937214 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfscx\" (UniqueName: \"kubernetes.io/projected/dc55c2f3-1e8f-48ac-9d6d-581737e07566-kube-api-access-wfscx\") pod \"horizon-operator-controller-manager-5d494799bf-h4t44\" (UID: \"dc55c2f3-1e8f-48ac-9d6d-581737e07566\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.937263 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lr828\" (UniqueName: \"kubernetes.io/projected/7b521baa-5390-41d7-8654-7b556346833d-kube-api-access-lr828\") pod \"heat-operator-controller-manager-5b77f656f-zbptq\" (UID: \"7b521baa-5390-41d7-8654-7b556346833d\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.937277 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.937331 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5b6m\" (UniqueName: \"kubernetes.io/projected/7f07c17d-8260-47a7-b1e1-0f16226838a7-kube-api-access-h5b6m\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.937353 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp5rk\" (UniqueName: \"kubernetes.io/projected/c476edc2-bbe3-4dca-a1fc-9a9c95f758c3-kube-api-access-sp5rk\") pod \"glance-operator-controller-manager-589cbd6b5b-st8n9\" (UID: \"c476edc2-bbe3-4dca-a1fc-9a9c95f758c3\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.937393 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pzh5\" (UniqueName: \"kubernetes.io/projected/d85e5da5-d129-4904-8bde-6ff4bb92614f-kube-api-access-6pzh5\") pod 
\"cinder-operator-controller-manager-6b7f75547b-m4bdb\" (UID: \"d85e5da5-d129-4904-8bde-6ff4bb92614f\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.937423 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc89t\" (UniqueName: \"kubernetes.io/projected/702bd8a5-fc1e-4ee9-b85b-01ea9d177a97-kube-api-access-fc89t\") pod \"designate-operator-controller-manager-955677c94-7vd97\" (UID: \"702bd8a5-fc1e-4ee9-b85b-01ea9d177a97\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.943538 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.944432 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.945909 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kwn5m" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.955705 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.965289 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pzh5\" (UniqueName: \"kubernetes.io/projected/d85e5da5-d129-4904-8bde-6ff4bb92614f-kube-api-access-6pzh5\") pod \"cinder-operator-controller-manager-6b7f75547b-m4bdb\" (UID: \"d85e5da5-d129-4904-8bde-6ff4bb92614f\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.969207 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpx7b\" (UniqueName: \"kubernetes.io/projected/90b33e2e-c7dc-4e9e-b479-dca5251277bc-kube-api-access-zpx7b\") pod \"barbican-operator-controller-manager-7b64f4fb85-c5rjq\" (UID: \"90b33e2e-c7dc-4e9e-b479-dca5251277bc\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.980228 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.981183 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.983404 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.984369 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.984885 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dlbs9" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.986944 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc"] Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.987256 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2ftv5" Nov 26 12:23:50 crc kubenswrapper[4834]: I1126 12:23:50.999496 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.018820 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.026915 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-m8fdr" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.035074 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.038739 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfscx\" (UniqueName: \"kubernetes.io/projected/dc55c2f3-1e8f-48ac-9d6d-581737e07566-kube-api-access-wfscx\") pod \"horizon-operator-controller-manager-5d494799bf-h4t44\" (UID: \"dc55c2f3-1e8f-48ac-9d6d-581737e07566\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.038781 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.038800 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr828\" (UniqueName: \"kubernetes.io/projected/7b521baa-5390-41d7-8654-7b556346833d-kube-api-access-lr828\") pod \"heat-operator-controller-manager-5b77f656f-zbptq\" (UID: \"7b521baa-5390-41d7-8654-7b556346833d\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.038823 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h5b6m\" (UniqueName: \"kubernetes.io/projected/7f07c17d-8260-47a7-b1e1-0f16226838a7-kube-api-access-h5b6m\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.038844 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5rk\" (UniqueName: \"kubernetes.io/projected/c476edc2-bbe3-4dca-a1fc-9a9c95f758c3-kube-api-access-sp5rk\") pod \"glance-operator-controller-manager-589cbd6b5b-st8n9\" (UID: \"c476edc2-bbe3-4dca-a1fc-9a9c95f758c3\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.038878 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc89t\" (UniqueName: \"kubernetes.io/projected/702bd8a5-fc1e-4ee9-b85b-01ea9d177a97-kube-api-access-fc89t\") pod \"designate-operator-controller-manager-955677c94-7vd97\" (UID: \"702bd8a5-fc1e-4ee9-b85b-01ea9d177a97\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.039667 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.039704 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert podName:7f07c17d-8260-47a7-b1e1-0f16226838a7 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:51.539691899 +0000 UTC m=+729.446905250 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert") pod "infra-operator-controller-manager-57548d458d-btq96" (UID: "7f07c17d-8260-47a7-b1e1-0f16226838a7") : secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.044831 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.046110 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.048676 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-t9vh7" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.051837 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.062874 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp5rk\" (UniqueName: \"kubernetes.io/projected/c476edc2-bbe3-4dca-a1fc-9a9c95f758c3-kube-api-access-sp5rk\") pod \"glance-operator-controller-manager-589cbd6b5b-st8n9\" (UID: \"c476edc2-bbe3-4dca-a1fc-9a9c95f758c3\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.063296 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfscx\" (UniqueName: \"kubernetes.io/projected/dc55c2f3-1e8f-48ac-9d6d-581737e07566-kube-api-access-wfscx\") pod \"horizon-operator-controller-manager-5d494799bf-h4t44\" (UID: \"dc55c2f3-1e8f-48ac-9d6d-581737e07566\") " 
pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.070902 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc89t\" (UniqueName: \"kubernetes.io/projected/702bd8a5-fc1e-4ee9-b85b-01ea9d177a97-kube-api-access-fc89t\") pod \"designate-operator-controller-manager-955677c94-7vd97\" (UID: \"702bd8a5-fc1e-4ee9-b85b-01ea9d177a97\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.071351 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr828\" (UniqueName: \"kubernetes.io/projected/7b521baa-5390-41d7-8654-7b556346833d-kube-api-access-lr828\") pod \"heat-operator-controller-manager-5b77f656f-zbptq\" (UID: \"7b521baa-5390-41d7-8654-7b556346833d\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.076580 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5b6m\" (UniqueName: \"kubernetes.io/projected/7f07c17d-8260-47a7-b1e1-0f16226838a7-kube-api-access-h5b6m\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.087370 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.091527 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.092475 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.094442 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-zzcwk" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.099527 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.101164 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.105783 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wrp9s" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.105981 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.111516 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.121722 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.122655 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.124888 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-kxgjk" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.126138 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.127095 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.128356 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.128408 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5t8bb" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.129870 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.138746 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.139365 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgch\" (UniqueName: \"kubernetes.io/projected/a6b8f0bf-a405-4b2a-91b2-1934cd2997b2-kube-api-access-tpgch\") pod \"manila-operator-controller-manager-5d499bf58b-ffwjp\" (UID: \"a6b8f0bf-a405-4b2a-91b2-1934cd2997b2\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.139402 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv6qj\" (UniqueName: \"kubernetes.io/projected/8911dae6-36bd-410e-847a-c7c7134bb5a4-kube-api-access-gv6qj\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-6phws\" (UID: \"8911dae6-36bd-410e-847a-c7c7134bb5a4\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.139426 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d7kg\" (UniqueName: \"kubernetes.io/projected/c6c80d48-ce10-48bd-8cfb-67db8079dc1b-kube-api-access-6d7kg\") pod \"keystone-operator-controller-manager-7b4567c7cf-2gfhc\" (UID: \"c6c80d48-ce10-48bd-8cfb-67db8079dc1b\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.139472 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbh7v\" (UniqueName: \"kubernetes.io/projected/2b31b20d-186a-4fb2-bfa2-914e5eda233e-kube-api-access-nbh7v\") pod \"ironic-operator-controller-manager-67cb4dc6d4-xf2w7\" (UID: \"2b31b20d-186a-4fb2-bfa2-914e5eda233e\") " 
pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.139547 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.140394 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.141471 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nfchf" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.143352 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.145840 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.152336 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.173339 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.174668 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.175979 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-vznng" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.176417 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.180769 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.186874 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.196288 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.207722 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.241984 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crrjj\" (UniqueName: \"kubernetes.io/projected/39e5784b-2de7-45cb-9741-a0840599fb52-kube-api-access-crrjj\") pod \"placement-operator-controller-manager-57988cc5b5-ppdpl\" (UID: \"39e5784b-2de7-45cb-9741-a0840599fb52\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.242123 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbh7v\" (UniqueName: \"kubernetes.io/projected/2b31b20d-186a-4fb2-bfa2-914e5eda233e-kube-api-access-nbh7v\") pod \"ironic-operator-controller-manager-67cb4dc6d4-xf2w7\" (UID: \"2b31b20d-186a-4fb2-bfa2-914e5eda233e\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.242225 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wvm9\" (UniqueName: \"kubernetes.io/projected/e37121ad-deca-4553-ae91-2a61eb0f9aac-kube-api-access-5wvm9\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.242342 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-275zp\" (UniqueName: \"kubernetes.io/projected/657e5afd-3aba-4acf-a85d-36ef32e8c5f8-kube-api-access-275zp\") pod \"octavia-operator-controller-manager-64cdc6ff96-kcp4x\" (UID: \"657e5afd-3aba-4acf-a85d-36ef32e8c5f8\") " 
pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.242426 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgch\" (UniqueName: \"kubernetes.io/projected/a6b8f0bf-a405-4b2a-91b2-1934cd2997b2-kube-api-access-tpgch\") pod \"manila-operator-controller-manager-5d499bf58b-ffwjp\" (UID: \"a6b8f0bf-a405-4b2a-91b2-1934cd2997b2\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.242504 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9tv7\" (UniqueName: \"kubernetes.io/projected/3811206c-bdee-4ea2-9f7c-7be96426f677-kube-api-access-g9tv7\") pod \"neutron-operator-controller-manager-6fdcddb789-7nh8j\" (UID: \"3811206c-bdee-4ea2-9f7c-7be96426f677\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.242575 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mk4d\" (UniqueName: \"kubernetes.io/projected/796eea14-80b6-43fc-b682-1fdaf61253ee-kube-api-access-6mk4d\") pod \"ovn-operator-controller-manager-56897c768d-5s84k\" (UID: \"796eea14-80b6-43fc-b682-1fdaf61253ee\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.242661 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqmsv\" (UniqueName: \"kubernetes.io/projected/14c2b899-bac2-43dd-844e-a66f4d75954a-kube-api-access-pqmsv\") pod \"nova-operator-controller-manager-79556f57fc-n4xj9\" (UID: \"14c2b899-bac2-43dd-844e-a66f4d75954a\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" Nov 26 12:23:51 crc 
kubenswrapper[4834]: I1126 12:23:51.242732 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv6qj\" (UniqueName: \"kubernetes.io/projected/8911dae6-36bd-410e-847a-c7c7134bb5a4-kube-api-access-gv6qj\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-6phws\" (UID: \"8911dae6-36bd-410e-847a-c7c7134bb5a4\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.242803 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.242875 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d7kg\" (UniqueName: \"kubernetes.io/projected/c6c80d48-ce10-48bd-8cfb-67db8079dc1b-kube-api-access-6d7kg\") pod \"keystone-operator-controller-manager-7b4567c7cf-2gfhc\" (UID: \"c6c80d48-ce10-48bd-8cfb-67db8079dc1b\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.271381 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.273360 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.277851 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d7kg\" (UniqueName: \"kubernetes.io/projected/c6c80d48-ce10-48bd-8cfb-67db8079dc1b-kube-api-access-6d7kg\") pod \"keystone-operator-controller-manager-7b4567c7cf-2gfhc\" (UID: \"c6c80d48-ce10-48bd-8cfb-67db8079dc1b\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.279991 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-k42bn" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.290964 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbh7v\" (UniqueName: \"kubernetes.io/projected/2b31b20d-186a-4fb2-bfa2-914e5eda233e-kube-api-access-nbh7v\") pod \"ironic-operator-controller-manager-67cb4dc6d4-xf2w7\" (UID: \"2b31b20d-186a-4fb2-bfa2-914e5eda233e\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.292813 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv6qj\" (UniqueName: \"kubernetes.io/projected/8911dae6-36bd-410e-847a-c7c7134bb5a4-kube-api-access-gv6qj\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-6phws\" (UID: \"8911dae6-36bd-410e-847a-c7c7134bb5a4\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.317118 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgch\" (UniqueName: \"kubernetes.io/projected/a6b8f0bf-a405-4b2a-91b2-1934cd2997b2-kube-api-access-tpgch\") pod \"manila-operator-controller-manager-5d499bf58b-ffwjp\" 
(UID: \"a6b8f0bf-a405-4b2a-91b2-1934cd2997b2\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.317524 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.334619 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.341127 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.345298 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crrjj\" (UniqueName: \"kubernetes.io/projected/39e5784b-2de7-45cb-9741-a0840599fb52-kube-api-access-crrjj\") pod \"placement-operator-controller-manager-57988cc5b5-ppdpl\" (UID: \"39e5784b-2de7-45cb-9741-a0840599fb52\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.345405 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wvm9\" (UniqueName: \"kubernetes.io/projected/e37121ad-deca-4553-ae91-2a61eb0f9aac-kube-api-access-5wvm9\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.345436 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-275zp\" (UniqueName: 
\"kubernetes.io/projected/657e5afd-3aba-4acf-a85d-36ef32e8c5f8-kube-api-access-275zp\") pod \"octavia-operator-controller-manager-64cdc6ff96-kcp4x\" (UID: \"657e5afd-3aba-4acf-a85d-36ef32e8c5f8\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.345464 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9tv7\" (UniqueName: \"kubernetes.io/projected/3811206c-bdee-4ea2-9f7c-7be96426f677-kube-api-access-g9tv7\") pod \"neutron-operator-controller-manager-6fdcddb789-7nh8j\" (UID: \"3811206c-bdee-4ea2-9f7c-7be96426f677\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.345483 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mk4d\" (UniqueName: \"kubernetes.io/projected/796eea14-80b6-43fc-b682-1fdaf61253ee-kube-api-access-6mk4d\") pod \"ovn-operator-controller-manager-56897c768d-5s84k\" (UID: \"796eea14-80b6-43fc-b682-1fdaf61253ee\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.345500 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqmsv\" (UniqueName: \"kubernetes.io/projected/14c2b899-bac2-43dd-844e-a66f4d75954a-kube-api-access-pqmsv\") pod \"nova-operator-controller-manager-79556f57fc-n4xj9\" (UID: \"14c2b899-bac2-43dd-844e-a66f4d75954a\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.345525 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: 
\"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.345551 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz7jg\" (UniqueName: \"kubernetes.io/projected/c9fbb19b-a6b3-47d2-b5c8-6574fa71c069-kube-api-access-kz7jg\") pod \"swift-operator-controller-manager-d77b94747-jtv6r\" (UID: \"c9fbb19b-a6b3-47d2-b5c8-6574fa71c069\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.346075 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.346121 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert podName:e37121ad-deca-4553-ae91-2a61eb0f9aac nodeName:}" failed. No retries permitted until 2025-11-26 12:23:51.846105767 +0000 UTC m=+729.753319119 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert") pod "openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" (UID: "e37121ad-deca-4553-ae91-2a61eb0f9aac") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.367753 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.370937 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9tv7\" (UniqueName: \"kubernetes.io/projected/3811206c-bdee-4ea2-9f7c-7be96426f677-kube-api-access-g9tv7\") pod \"neutron-operator-controller-manager-6fdcddb789-7nh8j\" (UID: \"3811206c-bdee-4ea2-9f7c-7be96426f677\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.372868 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mk4d\" (UniqueName: \"kubernetes.io/projected/796eea14-80b6-43fc-b682-1fdaf61253ee-kube-api-access-6mk4d\") pod \"ovn-operator-controller-manager-56897c768d-5s84k\" (UID: \"796eea14-80b6-43fc-b682-1fdaf61253ee\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.374212 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wvm9\" (UniqueName: \"kubernetes.io/projected/e37121ad-deca-4553-ae91-2a61eb0f9aac-kube-api-access-5wvm9\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.374778 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-275zp\" (UniqueName: \"kubernetes.io/projected/657e5afd-3aba-4acf-a85d-36ef32e8c5f8-kube-api-access-275zp\") pod \"octavia-operator-controller-manager-64cdc6ff96-kcp4x\" (UID: \"657e5afd-3aba-4acf-a85d-36ef32e8c5f8\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.378977 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqmsv\" (UniqueName: \"kubernetes.io/projected/14c2b899-bac2-43dd-844e-a66f4d75954a-kube-api-access-pqmsv\") pod \"nova-operator-controller-manager-79556f57fc-n4xj9\" (UID: \"14c2b899-bac2-43dd-844e-a66f4d75954a\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.387013 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crrjj\" (UniqueName: \"kubernetes.io/projected/39e5784b-2de7-45cb-9741-a0840599fb52-kube-api-access-crrjj\") pod \"placement-operator-controller-manager-57988cc5b5-ppdpl\" (UID: \"39e5784b-2de7-45cb-9741-a0840599fb52\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.419542 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.427998 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.440655 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.443374 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.444606 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.446211 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8s9m\" (UniqueName: \"kubernetes.io/projected/6ab4e014-587f-483e-83dd-4e0327e17828-kube-api-access-l8s9m\") pod \"telemetry-operator-controller-manager-76cc84c6bb-wz75v\" (UID: \"6ab4e014-587f-483e-83dd-4e0327e17828\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.446259 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz7jg\" (UniqueName: \"kubernetes.io/projected/c9fbb19b-a6b3-47d2-b5c8-6574fa71c069-kube-api-access-kz7jg\") pod \"swift-operator-controller-manager-d77b94747-jtv6r\" (UID: \"c9fbb19b-a6b3-47d2-b5c8-6574fa71c069\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.447073 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-s65gs" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.455775 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.464067 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.467724 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.468785 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.473244 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.474886 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-l4mz5" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.475284 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz7jg\" (UniqueName: \"kubernetes.io/projected/c9fbb19b-a6b3-47d2-b5c8-6574fa71c069-kube-api-access-kz7jg\") pod \"swift-operator-controller-manager-d77b94747-jtv6r\" (UID: \"c9fbb19b-a6b3-47d2-b5c8-6574fa71c069\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.504792 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.531419 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.531459 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.533283 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.534041 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.537467 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.537635 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wkh8b" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.537650 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.549828 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.549912 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8s9m\" (UniqueName: \"kubernetes.io/projected/6ab4e014-587f-483e-83dd-4e0327e17828-kube-api-access-l8s9m\") pod \"telemetry-operator-controller-manager-76cc84c6bb-wz75v\" (UID: \"6ab4e014-587f-483e-83dd-4e0327e17828\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.549941 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h5nk\" (UniqueName: \"kubernetes.io/projected/ddae14f0-dc74-43ea-937c-144dd9ecdb62-kube-api-access-7h5nk\") pod \"test-operator-controller-manager-5cd6c7f4c8-7s4pw\" (UID: \"ddae14f0-dc74-43ea-937c-144dd9ecdb62\") " 
pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.550036 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j"] Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.550358 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.550400 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert podName:7f07c17d-8260-47a7-b1e1-0f16226838a7 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:52.550383643 +0000 UTC m=+730.457596995 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert") pod "infra-operator-controller-manager-57548d458d-btq96" (UID: "7f07c17d-8260-47a7-b1e1-0f16226838a7") : secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.560850 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.564388 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8s9m\" (UniqueName: \"kubernetes.io/projected/6ab4e014-587f-483e-83dd-4e0327e17828-kube-api-access-l8s9m\") pod \"telemetry-operator-controller-manager-76cc84c6bb-wz75v\" (UID: \"6ab4e014-587f-483e-83dd-4e0327e17828\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.612868 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.618423 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.631649 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.632586 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.637590 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-nzmbv" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.644704 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.651001 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjqr\" (UniqueName: \"kubernetes.io/projected/de8f1d07-0e85-4245-a378-51c81152ef64-kube-api-access-ktjqr\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.651068 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.651127 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ww62\" (UniqueName: \"kubernetes.io/projected/52aee873-a614-4018-aa3d-beb4021c29f6-kube-api-access-6ww62\") pod \"watcher-operator-controller-manager-656dcb59d4-dfcmd\" (UID: \"52aee873-a614-4018-aa3d-beb4021c29f6\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 
12:23:51.651233 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h5nk\" (UniqueName: \"kubernetes.io/projected/ddae14f0-dc74-43ea-937c-144dd9ecdb62-kube-api-access-7h5nk\") pod \"test-operator-controller-manager-5cd6c7f4c8-7s4pw\" (UID: \"ddae14f0-dc74-43ea-937c-144dd9ecdb62\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.651300 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.662886 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.666960 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h5nk\" (UniqueName: \"kubernetes.io/projected/ddae14f0-dc74-43ea-937c-144dd9ecdb62-kube-api-access-7h5nk\") pod \"test-operator-controller-manager-5cd6c7f4c8-7s4pw\" (UID: \"ddae14f0-dc74-43ea-937c-144dd9ecdb62\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.667051 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq"] Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.752195 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs\") pod 
\"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.752249 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c58cf\" (UniqueName: \"kubernetes.io/projected/d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9-kube-api-access-c58cf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-qmpxl\" (UID: \"d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.752296 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjqr\" (UniqueName: \"kubernetes.io/projected/de8f1d07-0e85-4245-a378-51c81152ef64-kube-api-access-ktjqr\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.752349 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.752381 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ww62\" (UniqueName: \"kubernetes.io/projected/52aee873-a614-4018-aa3d-beb4021c29f6-kube-api-access-6ww62\") pod \"watcher-operator-controller-manager-656dcb59d4-dfcmd\" (UID: \"52aee873-a614-4018-aa3d-beb4021c29f6\") " 
pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.752462 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.752611 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:52.252590183 +0000 UTC m=+730.159803536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "metrics-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.752798 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.752837 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:52.252823603 +0000 UTC m=+730.160036956 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.764615 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.768389 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ww62\" (UniqueName: \"kubernetes.io/projected/52aee873-a614-4018-aa3d-beb4021c29f6-kube-api-access-6ww62\") pod \"watcher-operator-controller-manager-656dcb59d4-dfcmd\" (UID: \"52aee873-a614-4018-aa3d-beb4021c29f6\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.771698 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjqr\" (UniqueName: \"kubernetes.io/projected/de8f1d07-0e85-4245-a378-51c81152ef64-kube-api-access-ktjqr\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.793092 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.857091 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c58cf\" (UniqueName: \"kubernetes.io/projected/d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9-kube-api-access-c58cf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-qmpxl\" (UID: \"d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.857221 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.857426 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: E1126 12:23:51.857483 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert podName:e37121ad-deca-4553-ae91-2a61eb0f9aac nodeName:}" failed. No retries permitted until 2025-11-26 12:23:52.857469578 +0000 UTC m=+730.764682930 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert") pod "openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" (UID: "e37121ad-deca-4553-ae91-2a61eb0f9aac") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.876411 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c58cf\" (UniqueName: \"kubernetes.io/projected/d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9-kube-api-access-c58cf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-qmpxl\" (UID: \"d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" Nov 26 12:23:51 crc kubenswrapper[4834]: I1126 12:23:51.975220 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.004835 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.011062 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-7vd97"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.018883 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.022184 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.140396 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44"] Nov 26 12:23:52 crc 
kubenswrapper[4834]: W1126 12:23:52.146452 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc55c2f3_1e8f_48ac_9d6d_581737e07566.slice/crio-141d2f4002fd68d248d5b3d9f7a6c9110e54784177047a4a13231465d3a72d38 WatchSource:0}: Error finding container 141d2f4002fd68d248d5b3d9f7a6c9110e54784177047a4a13231465d3a72d38: Status 404 returned error can't find the container with id 141d2f4002fd68d248d5b3d9f7a6c9110e54784177047a4a13231465d3a72d38 Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.200081 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.204489 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.228140 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl"] Nov 26 12:23:52 crc kubenswrapper[4834]: W1126 12:23:52.230521 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39e5784b_2de7_45cb_9741_a0840599fb52.slice/crio-68a8e9e336cd8a0f488dba710c9eaa8b08c410902b9e803c774ccde1cf207b14 WatchSource:0}: Error finding container 68a8e9e336cd8a0f488dba710c9eaa8b08c410902b9e803c774ccde1cf207b14: Status 404 returned error can't find the container with id 68a8e9e336cd8a0f488dba710c9eaa8b08c410902b9e803c774ccde1cf207b14 Nov 26 12:23:52 crc kubenswrapper[4834]: W1126 12:23:52.231417 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9fbb19b_a6b3_47d2_b5c8_6574fa71c069.slice/crio-3029e4adb98493f5fb143067a784319a64252c68fb0980b479578d634c0475d1 WatchSource:0}: Error finding container 
3029e4adb98493f5fb143067a784319a64252c68fb0980b479578d634c0475d1: Status 404 returned error can't find the container with id 3029e4adb98493f5fb143067a784319a64252c68fb0980b479578d634c0475d1 Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.232346 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.236466 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.262991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.263081 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.263160 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.263209 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.263217 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:53.263201486 +0000 UTC m=+731.170414838 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "metrics-server-cert" not found Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.263277 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:53.263246221 +0000 UTC m=+731.170459573 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "webhook-server-cert" not found Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.358201 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.365105 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.378556 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v"] Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.389052 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-275zp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-64cdc6ff96-kcp4x_openstack-operators(657e5afd-3aba-4acf-a85d-36ef32e8c5f8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.390034 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l8s9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-wz75v_openstack-operators(6ab4e014-587f-483e-83dd-4e0327e17828): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.391422 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-275zp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-64cdc6ff96-kcp4x_openstack-operators(657e5afd-3aba-4acf-a85d-36ef32e8c5f8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.391940 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7"] Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.391640 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:d65dbfc956e9cf376f3c48fc3a0942cb7306b5164f898c40d1efca106df81db7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nbh7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-67cb4dc6d4-xf2w7_openstack-operators(2b31b20d-186a-4fb2-bfa2-914e5eda233e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.392345 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l8s9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-wz75v_openstack-operators(6ab4e014-587f-483e-83dd-4e0327e17828): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.392672 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" podUID="657e5afd-3aba-4acf-a85d-36ef32e8c5f8" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.393831 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" podUID="6ab4e014-587f-483e-83dd-4e0327e17828" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.394041 4834 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nbh7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-67cb4dc6d4-xf2w7_openstack-operators(2b31b20d-186a-4fb2-bfa2-914e5eda233e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.396990 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" podUID="2b31b20d-186a-4fb2-bfa2-914e5eda233e" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.397398 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e00a9ed0ab26c5b745bd804ab1fe6b22428d026f17ea05a05f045e060342f46c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g9tv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6fdcddb789-7nh8j_openstack-operators(3811206c-bdee-4ea2-9f7c-7be96426f677): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.399508 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g9tv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6fdcddb789-7nh8j_openstack-operators(3811206c-bdee-4ea2-9f7c-7be96426f677): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.400525 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j"] Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.400594 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" podUID="3811206c-bdee-4ea2-9f7c-7be96426f677" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.473414 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd"] Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.481628 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw"] Nov 26 12:23:52 crc kubenswrapper[4834]: W1126 12:23:52.485878 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52aee873_a614_4018_aa3d_beb4021c29f6.slice/crio-71e6a13a5da3421b9299ae46feebe497bf7ad3a226143a4fffc34eb62e4a4a01 WatchSource:0}: Error finding container 71e6a13a5da3421b9299ae46feebe497bf7ad3a226143a4fffc34eb62e4a4a01: Status 404 returned error can't find the container with id 71e6a13a5da3421b9299ae46feebe497bf7ad3a226143a4fffc34eb62e4a4a01 Nov 26 12:23:52 crc kubenswrapper[4834]: W1126 12:23:52.490945 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddae14f0_dc74_43ea_937c_144dd9ecdb62.slice/crio-8a337750c600c62cbfda97ea641b9d2e202f0764c92674f025a4c4f2c83d0d68 WatchSource:0}: Error finding container 8a337750c600c62cbfda97ea641b9d2e202f0764c92674f025a4c4f2c83d0d68: Status 404 returned error can't find the container with id 8a337750c600c62cbfda97ea641b9d2e202f0764c92674f025a4c4f2c83d0d68 Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.494897 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl"] Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.501704 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7h5nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd6c7f4c8-7s4pw_openstack-operators(ddae14f0-dc74-43ea-937c-144dd9ecdb62): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.506699 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7h5nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd6c7f4c8-7s4pw_openstack-operators(ddae14f0-dc74-43ea-937c-144dd9ecdb62): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.507800 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" podUID="ddae14f0-dc74-43ea-937c-144dd9ecdb62" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.508060 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c58cf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-qmpxl_openstack-operators(d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.511434 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" podUID="d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.568478 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.568751 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.568818 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert podName:7f07c17d-8260-47a7-b1e1-0f16226838a7 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:54.568799286 +0000 UTC m=+732.476012638 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert") pod "infra-operator-controller-manager-57548d458d-btq96" (UID: "7f07c17d-8260-47a7-b1e1-0f16226838a7") : secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.581282 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" event={"ID":"14c2b899-bac2-43dd-844e-a66f4d75954a","Type":"ContainerStarted","Data":"a956fff604824dd6e67a0aaa0d9c0c59d7c1c3773f67b85a52a2ce0a0161114e"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.582418 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" event={"ID":"52aee873-a614-4018-aa3d-beb4021c29f6","Type":"ContainerStarted","Data":"71e6a13a5da3421b9299ae46feebe497bf7ad3a226143a4fffc34eb62e4a4a01"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.583418 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" event={"ID":"a6b8f0bf-a405-4b2a-91b2-1934cd2997b2","Type":"ContainerStarted","Data":"f2695b320d9da8b353637a9d4eaf7885e773089e313736bf8ad9d585d7a69f47"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.584509 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" event={"ID":"2b31b20d-186a-4fb2-bfa2-914e5eda233e","Type":"ContainerStarted","Data":"241d280e3295efa4877eb1c4f67bfcd845cda1b65073b824955906174f6bd3b1"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.585274 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" 
event={"ID":"7b521baa-5390-41d7-8654-7b556346833d","Type":"ContainerStarted","Data":"7dc1743bc74323dd41a3cec19a711daff32823499ca98cc9fde89e6f5c06cd11"} Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.586417 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:d65dbfc956e9cf376f3c48fc3a0942cb7306b5164f898c40d1efca106df81db7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" podUID="2b31b20d-186a-4fb2-bfa2-914e5eda233e" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.586670 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" event={"ID":"c9fbb19b-a6b3-47d2-b5c8-6574fa71c069","Type":"ContainerStarted","Data":"3029e4adb98493f5fb143067a784319a64252c68fb0980b479578d634c0475d1"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.587626 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" event={"ID":"796eea14-80b6-43fc-b682-1fdaf61253ee","Type":"ContainerStarted","Data":"8906074a0ccafd27685cbf01443a8b706ef9ccd89b9cd617af3044b513397fc3"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.588402 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" event={"ID":"3811206c-bdee-4ea2-9f7c-7be96426f677","Type":"ContainerStarted","Data":"bdba414b97140ba5ab65623dff1f21791829c0372454d56ea4b61770bb7e57aa"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.591430 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" event={"ID":"657e5afd-3aba-4acf-a85d-36ef32e8c5f8","Type":"ContainerStarted","Data":"fba4628f3a23a41d0196711f4c24d020dd9d93d6724b2cfa33c9fc08c666d0e2"} Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.592009 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e00a9ed0ab26c5b745bd804ab1fe6b22428d026f17ea05a05f045e060342f46c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" podUID="3811206c-bdee-4ea2-9f7c-7be96426f677" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.593576 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" podUID="657e5afd-3aba-4acf-a85d-36ef32e8c5f8" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.593709 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" event={"ID":"90b33e2e-c7dc-4e9e-b479-dca5251277bc","Type":"ContainerStarted","Data":"f9fe6d72e209c1a8e4110ab99de74252eec00e48fa375f7359890df6b4a82bdc"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.594703 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" event={"ID":"c6c80d48-ce10-48bd-8cfb-67db8079dc1b","Type":"ContainerStarted","Data":"3930f48e90be8bdfd6e84542237ce7318331382fa48030d2c8a3f20863d0d5af"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.595968 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" event={"ID":"d85e5da5-d129-4904-8bde-6ff4bb92614f","Type":"ContainerStarted","Data":"3c010cc989bcab151981dbb44524e786ab590e0b39307c0b83c773ab98d51d2c"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.597176 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" event={"ID":"dc55c2f3-1e8f-48ac-9d6d-581737e07566","Type":"ContainerStarted","Data":"141d2f4002fd68d248d5b3d9f7a6c9110e54784177047a4a13231465d3a72d38"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.598662 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" event={"ID":"8911dae6-36bd-410e-847a-c7c7134bb5a4","Type":"ContainerStarted","Data":"7bc6b506137e75a24fd257d175db2ddd48ceb6d1e621105ef472ea58786e791d"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.600094 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" event={"ID":"c476edc2-bbe3-4dca-a1fc-9a9c95f758c3","Type":"ContainerStarted","Data":"43c4eddf0eb14f3598e502d04ba99ebaa937f59cf4b83984219dede227becb9b"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.606382 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" event={"ID":"702bd8a5-fc1e-4ee9-b85b-01ea9d177a97","Type":"ContainerStarted","Data":"eb916838c26cdb562b4283dd0d5bf12ecbe050aa5bd7fc1f54e2def8133a587d"} Nov 26 12:23:52 crc 
kubenswrapper[4834]: I1126 12:23:52.608679 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" event={"ID":"d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9","Type":"ContainerStarted","Data":"00dc24b94c1d16099eaeffa5bbe5744f30879ae2b03708ccefc72cd4009829ab"} Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.609646 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" podUID="d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.611150 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" event={"ID":"6ab4e014-587f-483e-83dd-4e0327e17828","Type":"ContainerStarted","Data":"99545797b69ca731c110d4c4a1077e6555586373154d62fa8b3012be3d46cb68"} Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.612850 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" event={"ID":"39e5784b-2de7-45cb-9741-a0840599fb52","Type":"ContainerStarted","Data":"68a8e9e336cd8a0f488dba710c9eaa8b08c410902b9e803c774ccde1cf207b14"} Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.614871 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" podUID="6ab4e014-587f-483e-83dd-4e0327e17828" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.616681 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" event={"ID":"ddae14f0-dc74-43ea-937c-144dd9ecdb62","Type":"ContainerStarted","Data":"8a337750c600c62cbfda97ea641b9d2e202f0764c92674f025a4c4f2c83d0d68"} Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.618455 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" podUID="ddae14f0-dc74-43ea-937c-144dd9ecdb62" Nov 26 12:23:52 crc kubenswrapper[4834]: I1126 12:23:52.873444 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.873662 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:52 crc kubenswrapper[4834]: E1126 12:23:52.873786 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert podName:e37121ad-deca-4553-ae91-2a61eb0f9aac nodeName:}" failed. No retries permitted until 2025-11-26 12:23:54.873769162 +0000 UTC m=+732.780982513 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert") pod "openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" (UID: "e37121ad-deca-4553-ae91-2a61eb0f9aac") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:53 crc kubenswrapper[4834]: I1126 12:23:53.279407 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:53 crc kubenswrapper[4834]: I1126 12:23:53.279769 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.279621 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.279866 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. 
No retries permitted until 2025-11-26 12:23:55.279840038 +0000 UTC m=+733.187053390 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "metrics-server-cert" not found Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.279935 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.279993 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:55.279976705 +0000 UTC m=+733.187190057 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "webhook-server-cert" not found Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.628228 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" podUID="d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9" Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.630717 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" podUID="657e5afd-3aba-4acf-a85d-36ef32e8c5f8" Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.630808 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" podUID="ddae14f0-dc74-43ea-937c-144dd9ecdb62" Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.630977 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e00a9ed0ab26c5b745bd804ab1fe6b22428d026f17ea05a05f045e060342f46c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" podUID="3811206c-bdee-4ea2-9f7c-7be96426f677" Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.631077 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:d65dbfc956e9cf376f3c48fc3a0942cb7306b5164f898c40d1efca106df81db7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" podUID="2b31b20d-186a-4fb2-bfa2-914e5eda233e" Nov 26 12:23:53 crc kubenswrapper[4834]: E1126 12:23:53.632198 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" podUID="6ab4e014-587f-483e-83dd-4e0327e17828" Nov 26 12:23:54 crc kubenswrapper[4834]: I1126 12:23:54.601060 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:54 crc kubenswrapper[4834]: E1126 12:23:54.601271 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:54 crc kubenswrapper[4834]: E1126 12:23:54.601338 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert podName:7f07c17d-8260-47a7-b1e1-0f16226838a7 nodeName:}" failed. 
No retries permitted until 2025-11-26 12:23:58.601304354 +0000 UTC m=+736.508517706 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert") pod "infra-operator-controller-manager-57548d458d-btq96" (UID: "7f07c17d-8260-47a7-b1e1-0f16226838a7") : secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:54 crc kubenswrapper[4834]: I1126 12:23:54.904651 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:54 crc kubenswrapper[4834]: E1126 12:23:54.904805 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:54 crc kubenswrapper[4834]: E1126 12:23:54.904865 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert podName:e37121ad-deca-4553-ae91-2a61eb0f9aac nodeName:}" failed. No retries permitted until 2025-11-26 12:23:58.904848091 +0000 UTC m=+736.812061443 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert") pod "openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" (UID: "e37121ad-deca-4553-ae91-2a61eb0f9aac") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:55 crc kubenswrapper[4834]: I1126 12:23:55.309023 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:55 crc kubenswrapper[4834]: I1126 12:23:55.309111 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:55 crc kubenswrapper[4834]: E1126 12:23:55.309236 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 12:23:55 crc kubenswrapper[4834]: E1126 12:23:55.309259 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 12:23:55 crc kubenswrapper[4834]: E1126 12:23:55.309289 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:59.309277312 +0000 UTC m=+737.216490664 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "webhook-server-cert" not found Nov 26 12:23:55 crc kubenswrapper[4834]: E1126 12:23:55.309358 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. No retries permitted until 2025-11-26 12:23:59.309336865 +0000 UTC m=+737.216550216 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "metrics-server-cert" not found Nov 26 12:23:58 crc kubenswrapper[4834]: I1126 12:23:58.654808 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:23:58 crc kubenswrapper[4834]: E1126 12:23:58.655016 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:58 crc kubenswrapper[4834]: E1126 12:23:58.655264 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert podName:7f07c17d-8260-47a7-b1e1-0f16226838a7 nodeName:}" failed. No retries permitted until 2025-11-26 12:24:06.655244779 +0000 UTC m=+744.562458131 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert") pod "infra-operator-controller-manager-57548d458d-btq96" (UID: "7f07c17d-8260-47a7-b1e1-0f16226838a7") : secret "infra-operator-webhook-server-cert" not found Nov 26 12:23:58 crc kubenswrapper[4834]: I1126 12:23:58.959694 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:23:58 crc kubenswrapper[4834]: E1126 12:23:58.959952 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:58 crc kubenswrapper[4834]: E1126 12:23:58.960060 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert podName:e37121ad-deca-4553-ae91-2a61eb0f9aac nodeName:}" failed. No retries permitted until 2025-11-26 12:24:06.960038774 +0000 UTC m=+744.867252126 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert") pod "openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" (UID: "e37121ad-deca-4553-ae91-2a61eb0f9aac") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 12:23:59 crc kubenswrapper[4834]: I1126 12:23:59.365963 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:59 crc kubenswrapper[4834]: I1126 12:23:59.366076 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:23:59 crc kubenswrapper[4834]: E1126 12:23:59.366136 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 12:23:59 crc kubenswrapper[4834]: E1126 12:23:59.366176 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 12:23:59 crc kubenswrapper[4834]: E1126 12:23:59.366208 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. No retries permitted until 2025-11-26 12:24:07.366191835 +0000 UTC m=+745.273405187 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "metrics-server-cert" not found Nov 26 12:23:59 crc kubenswrapper[4834]: E1126 12:23:59.366225 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs podName:de8f1d07-0e85-4245-a378-51c81152ef64 nodeName:}" failed. No retries permitted until 2025-11-26 12:24:07.366217644 +0000 UTC m=+745.273430996 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs") pod "openstack-operator-controller-manager-659d75f7c6-78m9j" (UID: "de8f1d07-0e85-4245-a378-51c81152ef64") : secret "webhook-server-cert" not found Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.126245 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gv6qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-66f4dd4bc7-6phws_openstack-operators(8911dae6-36bd-410e-847a-c7c7134bb5a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.127732 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" podUID="8911dae6-36bd-410e-847a-c7c7134bb5a4" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.130766 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6d7kg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7b4567c7cf-2gfhc_openstack-operators(c6c80d48-ce10-48bd-8cfb-67db8079dc1b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.130952 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-crrjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57988cc5b5-ppdpl_openstack-operators(39e5784b-2de7-45cb-9741-a0840599fb52): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.132076 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" podUID="c6c80d48-ce10-48bd-8cfb-67db8079dc1b" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.132146 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" podUID="39e5784b-2de7-45cb-9741-a0840599fb52" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.134893 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6pzh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6b7f75547b-m4bdb_openstack-operators(d85e5da5-d129-4904-8bde-6ff4bb92614f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.136781 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" podUID="d85e5da5-d129-4904-8bde-6ff4bb92614f" Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.688882 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" event={"ID":"d85e5da5-d129-4904-8bde-6ff4bb92614f","Type":"ContainerStarted","Data":"1daf534db7a8c96587b03386bd0ad7819d989472aa5471cd286d58e4c344e2ce"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.689425 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.690971 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" podUID="d85e5da5-d129-4904-8bde-6ff4bb92614f" Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.692032 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" event={"ID":"7b521baa-5390-41d7-8654-7b556346833d","Type":"ContainerStarted","Data":"77a368d97230a17590a6c1a57e3428231a50bababde84cd7cea62b458bee78a8"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.693103 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" event={"ID":"c476edc2-bbe3-4dca-a1fc-9a9c95f758c3","Type":"ContainerStarted","Data":"7be722cd72065020fd9997fa99cefce144476daa52008b1fdd5dd48fd36ba2b1"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.694376 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" event={"ID":"796eea14-80b6-43fc-b682-1fdaf61253ee","Type":"ContainerStarted","Data":"ad808b50fbc419f97fb5b0245d5409be3351fb10b394920fc6a075dee07b274e"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 
12:24:04.697106 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" event={"ID":"dc55c2f3-1e8f-48ac-9d6d-581737e07566","Type":"ContainerStarted","Data":"a268624008eb0270d06e1c606807aad1831c60829171f164fc6d6479cd9ada30"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.697918 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" event={"ID":"8911dae6-36bd-410e-847a-c7c7134bb5a4","Type":"ContainerStarted","Data":"3937a23c9cf274dd328b704b2aafa6f0655d1a275e0df4ebbcc5f42fb08d5d9b"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.698219 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.699680 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" podUID="8911dae6-36bd-410e-847a-c7c7134bb5a4" Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.701626 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" event={"ID":"a6b8f0bf-a405-4b2a-91b2-1934cd2997b2","Type":"ContainerStarted","Data":"cf32692bd07f4559e77a83770c2389409dd8ffadd38db92a5806726a31dbd660"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.703020 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" event={"ID":"90b33e2e-c7dc-4e9e-b479-dca5251277bc","Type":"ContainerStarted","Data":"016545dd06e05c23fa1afd8a472dc3777822b400cef0cdecf3e1449e1ae062e1"} Nov 26 12:24:04 crc 
kubenswrapper[4834]: I1126 12:24:04.705254 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" event={"ID":"39e5784b-2de7-45cb-9741-a0840599fb52","Type":"ContainerStarted","Data":"747f6ced5f10788885db91bb827f57207c0a6059562e4499231460c776d64ccd"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.705952 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.706264 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" podUID="39e5784b-2de7-45cb-9741-a0840599fb52" Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.715836 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" event={"ID":"c6c80d48-ce10-48bd-8cfb-67db8079dc1b","Type":"ContainerStarted","Data":"e04612a7e388fbd56d633257e830d9e02ce33ae5e5336a1d4b893a38358c1465"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.715977 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" Nov 26 12:24:04 crc kubenswrapper[4834]: E1126 12:24:04.718061 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" podUID="c6c80d48-ce10-48bd-8cfb-67db8079dc1b" Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 
12:24:04.723573 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" event={"ID":"14c2b899-bac2-43dd-844e-a66f4d75954a","Type":"ContainerStarted","Data":"3856c53ae1989968472651551581e3904f82b63e0553e99800683cc554945e02"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.729342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" event={"ID":"c9fbb19b-a6b3-47d2-b5c8-6574fa71c069","Type":"ContainerStarted","Data":"4c96edf5550eb1b1f05b9f2794a1cc66c9e25d8d252d745d9658322c52dcbff8"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.732481 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" event={"ID":"52aee873-a614-4018-aa3d-beb4021c29f6","Type":"ContainerStarted","Data":"b59456343e9b54e44c1b9193f314e5455f9fb272d16d1d1a069cad188a4a565a"} Nov 26 12:24:04 crc kubenswrapper[4834]: I1126 12:24:04.738565 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" event={"ID":"702bd8a5-fc1e-4ee9-b85b-01ea9d177a97","Type":"ContainerStarted","Data":"31c2c6a0186f622599b8160f3c6853ba472b18d2e96cb2595aaed91285864053"} Nov 26 12:24:05 crc kubenswrapper[4834]: E1126 12:24:05.748089 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" podUID="8911dae6-36bd-410e-847a-c7c7134bb5a4" Nov 26 12:24:05 crc kubenswrapper[4834]: E1126 12:24:05.748506 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" podUID="d85e5da5-d129-4904-8bde-6ff4bb92614f" Nov 26 12:24:05 crc kubenswrapper[4834]: E1126 12:24:05.748862 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" podUID="c6c80d48-ce10-48bd-8cfb-67db8079dc1b" Nov 26 12:24:05 crc kubenswrapper[4834]: E1126 12:24:05.749269 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" podUID="39e5784b-2de7-45cb-9741-a0840599fb52" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.671960 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:24:06 crc kubenswrapper[4834]: E1126 12:24:06.672158 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 12:24:06 crc kubenswrapper[4834]: E1126 12:24:06.672601 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert podName:7f07c17d-8260-47a7-b1e1-0f16226838a7 nodeName:}" failed. 
No retries permitted until 2025-11-26 12:24:22.67258512 +0000 UTC m=+760.579798473 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert") pod "infra-operator-controller-manager-57548d458d-btq96" (UID: "7f07c17d-8260-47a7-b1e1-0f16226838a7") : secret "infra-operator-webhook-server-cert" not found Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.756470 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" event={"ID":"796eea14-80b6-43fc-b682-1fdaf61253ee","Type":"ContainerStarted","Data":"bfa05de2dca0e314faf66a985b2f18da837d5ffa59c0c30f495bb6b6ae3653c6"} Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.757286 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.762946 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" event={"ID":"dc55c2f3-1e8f-48ac-9d6d-581737e07566","Type":"ContainerStarted","Data":"2080b15f848c6c1dae220f5d90f07ea51559ee614978cc325ac11fb953512533"} Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.763121 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.765773 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" event={"ID":"c476edc2-bbe3-4dca-a1fc-9a9c95f758c3","Type":"ContainerStarted","Data":"add4c185efa0af478597b4d5c11c5a551809a23d16251c272cc0251442381236"} Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.765807 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.767901 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" event={"ID":"702bd8a5-fc1e-4ee9-b85b-01ea9d177a97","Type":"ContainerStarted","Data":"c64e548136bd1f96cfcc84bc0e9ef0dcf50cc169c3cca2f039853bfa3bfd9f9e"} Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.768383 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.774281 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" event={"ID":"a6b8f0bf-a405-4b2a-91b2-1934cd2997b2","Type":"ContainerStarted","Data":"9abd35c6cad6687625f08afad3a91dc2ce7dd4fc8389f1fc19444dae05784dee"} Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.774498 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.777752 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" event={"ID":"90b33e2e-c7dc-4e9e-b479-dca5251277bc","Type":"ContainerStarted","Data":"c54e998e28a4c78e1c43679dd8b3823b62bc06d6c37e1ba962d4f5868a4677e7"} Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.778724 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" podStartSLOduration=1.708626134 podStartE2EDuration="15.77870788s" podCreationTimestamp="2025-11-26 12:23:51 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.388641016 +0000 UTC m=+730.295854369" lastFinishedPulling="2025-11-26 
12:24:06.458722763 +0000 UTC m=+744.365936115" observedRunningTime="2025-11-26 12:24:06.772909887 +0000 UTC m=+744.680123240" watchObservedRunningTime="2025-11-26 12:24:06.77870788 +0000 UTC m=+744.685921231" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.794555 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" podStartSLOduration=2.4416377479999998 podStartE2EDuration="16.794535541s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.148605662 +0000 UTC m=+730.055819013" lastFinishedPulling="2025-11-26 12:24:06.501503454 +0000 UTC m=+744.408716806" observedRunningTime="2025-11-26 12:24:06.787179272 +0000 UTC m=+744.694392624" watchObservedRunningTime="2025-11-26 12:24:06.794535541 +0000 UTC m=+744.701748894" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.807730 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" podStartSLOduration=2.321111193 podStartE2EDuration="16.807713248s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.016150594 +0000 UTC m=+729.923363946" lastFinishedPulling="2025-11-26 12:24:06.502752649 +0000 UTC m=+744.409966001" observedRunningTime="2025-11-26 12:24:06.804567536 +0000 UTC m=+744.711780888" watchObservedRunningTime="2025-11-26 12:24:06.807713248 +0000 UTC m=+744.714926599" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.819945 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" podStartSLOduration=2.407272835 podStartE2EDuration="16.819927296s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.025644755 +0000 UTC m=+729.932858107" lastFinishedPulling="2025-11-26 12:24:06.438299216 
+0000 UTC m=+744.345512568" observedRunningTime="2025-11-26 12:24:06.816122563 +0000 UTC m=+744.723335915" watchObservedRunningTime="2025-11-26 12:24:06.819927296 +0000 UTC m=+744.727140648" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.836118 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" podStartSLOduration=2.031413931 podStartE2EDuration="16.836102073s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:51.729592532 +0000 UTC m=+729.636805884" lastFinishedPulling="2025-11-26 12:24:06.534280674 +0000 UTC m=+744.441494026" observedRunningTime="2025-11-26 12:24:06.831873179 +0000 UTC m=+744.739086532" watchObservedRunningTime="2025-11-26 12:24:06.836102073 +0000 UTC m=+744.743315425" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.844609 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" podStartSLOduration=2.612683878 podStartE2EDuration="16.844596349s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.206993228 +0000 UTC m=+730.114206580" lastFinishedPulling="2025-11-26 12:24:06.438905699 +0000 UTC m=+744.346119051" observedRunningTime="2025-11-26 12:24:06.844344945 +0000 UTC m=+744.751558297" watchObservedRunningTime="2025-11-26 12:24:06.844596349 +0000 UTC m=+744.751809700" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 12:24:06.978028 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:24:06 crc kubenswrapper[4834]: I1126 
12:24:06.986559 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37121ad-deca-4553-ae91-2a61eb0f9aac-cert\") pod \"openstack-baremetal-operator-controller-manager-674cb676c8z4lrf\" (UID: \"e37121ad-deca-4553-ae91-2a61eb0f9aac\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.043648 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.384989 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.385065 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.391009 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-webhook-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 
12:24:07.391554 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8f1d07-0e85-4245-a378-51c81152ef64-metrics-certs\") pod \"openstack-operator-controller-manager-659d75f7c6-78m9j\" (UID: \"de8f1d07-0e85-4245-a378-51c81152ef64\") " pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.441259 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf"] Nov 26 12:24:07 crc kubenswrapper[4834]: W1126 12:24:07.446464 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37121ad_deca_4553_ae91_2a61eb0f9aac.slice/crio-2484ded37e04ae89a66cf9d18bb4752a89c834793cff52656c44b5fc352680cb WatchSource:0}: Error finding container 2484ded37e04ae89a66cf9d18bb4752a89c834793cff52656c44b5fc352680cb: Status 404 returned error can't find the container with id 2484ded37e04ae89a66cf9d18bb4752a89c834793cff52656c44b5fc352680cb Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.462270 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.785538 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" event={"ID":"e37121ad-deca-4553-ae91-2a61eb0f9aac","Type":"ContainerStarted","Data":"2484ded37e04ae89a66cf9d18bb4752a89c834793cff52656c44b5fc352680cb"} Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.788835 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" event={"ID":"c9fbb19b-a6b3-47d2-b5c8-6574fa71c069","Type":"ContainerStarted","Data":"c9882e7e11908a6ab14a9a49df3dfb8a84ec2a630d2f4fad110f6dbdba48e6b4"} Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.789668 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.791339 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" event={"ID":"14c2b899-bac2-43dd-844e-a66f4d75954a","Type":"ContainerStarted","Data":"58549e6577f5c9f0e9a816bff847011af6994962016b9c432824a82fab23400c"} Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.791837 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.793142 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" event={"ID":"52aee873-a614-4018-aa3d-beb4021c29f6","Type":"ContainerStarted","Data":"2005eb59a9332d212cae5e8fe3d32deaaf9ab3b6a1bb33f6af8a9664939ec375"} Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.793682 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.796985 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" event={"ID":"7b521baa-5390-41d7-8654-7b556346833d","Type":"ContainerStarted","Data":"33e7f1d7dea0a9585f49fa35e55bb642c26ee25da5020483042969524f41717c"} Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.797564 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.797846 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.805893 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" podStartSLOduration=2.145446414 podStartE2EDuration="16.805880152s" podCreationTimestamp="2025-11-26 12:23:51 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.234758869 +0000 UTC m=+730.141972210" lastFinishedPulling="2025-11-26 12:24:06.895192596 +0000 UTC m=+744.802405948" observedRunningTime="2025-11-26 12:24:07.805413181 +0000 UTC m=+745.712626533" watchObservedRunningTime="2025-11-26 12:24:07.805880152 +0000 UTC m=+745.713093504" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.828538 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" podStartSLOduration=2.76628197 podStartE2EDuration="17.828527442s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.036506805 +0000 UTC m=+729.943720157" 
lastFinishedPulling="2025-11-26 12:24:07.098752277 +0000 UTC m=+745.005965629" observedRunningTime="2025-11-26 12:24:07.819474223 +0000 UTC m=+745.726687575" watchObservedRunningTime="2025-11-26 12:24:07.828527442 +0000 UTC m=+745.735740795" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.836473 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" podStartSLOduration=3.138360486 podStartE2EDuration="17.83646588s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.220479345 +0000 UTC m=+730.127692698" lastFinishedPulling="2025-11-26 12:24:06.91858474 +0000 UTC m=+744.825798092" observedRunningTime="2025-11-26 12:24:07.836056989 +0000 UTC m=+745.743270341" watchObservedRunningTime="2025-11-26 12:24:07.83646588 +0000 UTC m=+745.743679232" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.849425 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" podStartSLOduration=2.163326507 podStartE2EDuration="16.849417159s" podCreationTimestamp="2025-11-26 12:23:51 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.491565305 +0000 UTC m=+730.398778657" lastFinishedPulling="2025-11-26 12:24:07.177655957 +0000 UTC m=+745.084869309" observedRunningTime="2025-11-26 12:24:07.847389156 +0000 UTC m=+745.754602508" watchObservedRunningTime="2025-11-26 12:24:07.849417159 +0000 UTC m=+745.756630511" Nov 26 12:24:07 crc kubenswrapper[4834]: I1126 12:24:07.882126 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j"] Nov 26 12:24:08 crc kubenswrapper[4834]: W1126 12:24:08.147806 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde8f1d07_0e85_4245_a378_51c81152ef64.slice/crio-cd6d94e193e93fba07cdb233a361392e656498485382c084359c09a7a52374c2 WatchSource:0}: Error finding container cd6d94e193e93fba07cdb233a361392e656498485382c084359c09a7a52374c2: Status 404 returned error can't find the container with id cd6d94e193e93fba07cdb233a361392e656498485382c084359c09a7a52374c2 Nov 26 12:24:08 crc kubenswrapper[4834]: I1126 12:24:08.805233 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" event={"ID":"de8f1d07-0e85-4245-a378-51c81152ef64","Type":"ContainerStarted","Data":"cd6d94e193e93fba07cdb233a361392e656498485382c084359c09a7a52374c2"} Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.813029 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" event={"ID":"6ab4e014-587f-483e-83dd-4e0327e17828","Type":"ContainerStarted","Data":"7539e9e2072e91274dd095e208b915367e99f23cb6438a7658490793b6523e85"} Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.813089 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" event={"ID":"6ab4e014-587f-483e-83dd-4e0327e17828","Type":"ContainerStarted","Data":"817b6e9027c977fea54dfc7f50ec518b904e3c355608a202f30d6ff501ffbed4"} Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.813243 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.814501 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" 
event={"ID":"3811206c-bdee-4ea2-9f7c-7be96426f677","Type":"ContainerStarted","Data":"50f1df31cdd1cf2947e54aacc1caca4be14c97214d137f215a5f8f0e1587d5b4"} Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.814524 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" event={"ID":"3811206c-bdee-4ea2-9f7c-7be96426f677","Type":"ContainerStarted","Data":"5d27e14532c672987ae8c7d87e21592b5b31d6b10daba6d60787d1e4f9c3cfac"} Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.814861 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.816342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" event={"ID":"de8f1d07-0e85-4245-a378-51c81152ef64","Type":"ContainerStarted","Data":"a04c1ccbf0adc6314b1d4859f1991b87198c7861489852cd87ef7ccc02d2739e"} Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.816676 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.818304 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" event={"ID":"657e5afd-3aba-4acf-a85d-36ef32e8c5f8","Type":"ContainerStarted","Data":"1952f0ceb1964bef02b474c181749f2132e7ec32c4d158be454246e20cf81c8f"} Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.818348 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" event={"ID":"657e5afd-3aba-4acf-a85d-36ef32e8c5f8","Type":"ContainerStarted","Data":"d63a47c1726c41d5c1a63502a2e75dfc94d31a4ee43042a561edc444ee72b8c1"} Nov 26 12:24:09 crc 
kubenswrapper[4834]: I1126 12:24:09.818715 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.821123 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-n4xj9" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.821692 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-zbptq" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.823461 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d77b94747-jtv6r" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.824497 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-dfcmd" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.825082 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" podStartSLOduration=2.269650377 podStartE2EDuration="18.825062351s" podCreationTimestamp="2025-11-26 12:23:51 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.389807536 +0000 UTC m=+730.297020888" lastFinishedPulling="2025-11-26 12:24:08.94521951 +0000 UTC m=+746.852432862" observedRunningTime="2025-11-26 12:24:09.824754461 +0000 UTC m=+747.731967813" watchObservedRunningTime="2025-11-26 12:24:09.825062351 +0000 UTC m=+747.732275703" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.839255 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" podStartSLOduration=3.281775137 podStartE2EDuration="19.839239782s" 
podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.388815916 +0000 UTC m=+730.296029268" lastFinishedPulling="2025-11-26 12:24:08.94628056 +0000 UTC m=+746.853493913" observedRunningTime="2025-11-26 12:24:09.836267016 +0000 UTC m=+747.743480369" watchObservedRunningTime="2025-11-26 12:24:09.839239782 +0000 UTC m=+747.746453134" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.897325 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" podStartSLOduration=18.897291105 podStartE2EDuration="18.897291105s" podCreationTimestamp="2025-11-26 12:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:24:09.887972655 +0000 UTC m=+747.795186007" watchObservedRunningTime="2025-11-26 12:24:09.897291105 +0000 UTC m=+747.804504457" Nov 26 12:24:09 crc kubenswrapper[4834]: I1126 12:24:09.945751 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" podStartSLOduration=3.396941637 podStartE2EDuration="19.945735604s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.397274935 +0000 UTC m=+730.304488287" lastFinishedPulling="2025-11-26 12:24:08.946068902 +0000 UTC m=+746.853282254" observedRunningTime="2025-11-26 12:24:09.942490637 +0000 UTC m=+747.849703988" watchObservedRunningTime="2025-11-26 12:24:09.945735604 +0000 UTC m=+747.852948957" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.142128 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-c5rjq" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.148267 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.178284 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-st8n9" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.190146 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-955677c94-7vd97" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.198746 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-h4t44" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.320215 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.340249 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-ffwjp" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.344398 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.444670 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-5s84k" Nov 26 12:24:11 crc kubenswrapper[4834]: I1126 12:24:11.459057 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.841823 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" event={"ID":"c6c80d48-ce10-48bd-8cfb-67db8079dc1b","Type":"ContainerStarted","Data":"22547309bfeea8f315b39566cbce84aff637e796d8f292520d9e74b198c25716"} Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.844933 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" event={"ID":"d85e5da5-d129-4904-8bde-6ff4bb92614f","Type":"ContainerStarted","Data":"77495747100ad7a9f2d599d5e236a136fef24a878aca972d8a2ad749b58c74bb"} Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.848615 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" event={"ID":"2b31b20d-186a-4fb2-bfa2-914e5eda233e","Type":"ContainerStarted","Data":"d872673325d03d22a616cc41b2ebf632da9f05be89f50d8d235a8f313b3aef86"} Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.853602 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" event={"ID":"ddae14f0-dc74-43ea-937c-144dd9ecdb62","Type":"ContainerStarted","Data":"f31436944145e44c43ed6881ca16000ed6028f4f716cc66411362df8d4228710"} Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.856549 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" event={"ID":"8911dae6-36bd-410e-847a-c7c7134bb5a4","Type":"ContainerStarted","Data":"efca76ea9829dfcb13fb612a6e3cdd2ef3efffea4fa1519c64688cb3f8aff9da"} Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.858470 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" event={"ID":"d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9","Type":"ContainerStarted","Data":"261bf9b9d3815387166184d5c727a16f0be24fa8ed73e8b3628edcd29279f2ed"} Nov 26 12:24:12 crc 
kubenswrapper[4834]: I1126 12:24:12.860441 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-2gfhc" podStartSLOduration=12.600696624 podStartE2EDuration="22.860430007s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.038197893 +0000 UTC m=+729.945411244" lastFinishedPulling="2025-11-26 12:24:02.297931275 +0000 UTC m=+740.205144627" observedRunningTime="2025-11-26 12:24:12.859077558 +0000 UTC m=+750.766290899" watchObservedRunningTime="2025-11-26 12:24:12.860430007 +0000 UTC m=+750.767643359" Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.864349 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" event={"ID":"e37121ad-deca-4553-ae91-2a61eb0f9aac","Type":"ContainerStarted","Data":"13dee283cea3a21276b896e24c64f8e6b4b5feb860465e7e8eb412c719248c84"} Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.864740 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.871012 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" event={"ID":"39e5784b-2de7-45cb-9741-a0840599fb52","Type":"ContainerStarted","Data":"de9f6015dbb9d871b4dd4f6dc34237e137a87795a3592c324828adf55840be3e"} Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.881565 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qmpxl" podStartSLOduration=1.937447273 podStartE2EDuration="21.881544077s" podCreationTimestamp="2025-11-26 12:23:51 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.507981026 +0000 UTC m=+730.415194378" 
lastFinishedPulling="2025-11-26 12:24:12.452077829 +0000 UTC m=+750.359291182" observedRunningTime="2025-11-26 12:24:12.878393356 +0000 UTC m=+750.785606708" watchObservedRunningTime="2025-11-26 12:24:12.881544077 +0000 UTC m=+750.788757429" Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.892631 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-m4bdb" podStartSLOduration=12.739077214 podStartE2EDuration="22.892618327s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:51.683178531 +0000 UTC m=+729.590391893" lastFinishedPulling="2025-11-26 12:24:01.836719653 +0000 UTC m=+739.743933006" observedRunningTime="2025-11-26 12:24:12.890990498 +0000 UTC m=+750.798203850" watchObservedRunningTime="2025-11-26 12:24:12.892618327 +0000 UTC m=+750.799831679" Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.913576 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-6phws" podStartSLOduration=12.903414995 podStartE2EDuration="22.91355941s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.242776956 +0000 UTC m=+730.149990308" lastFinishedPulling="2025-11-26 12:24:02.252921372 +0000 UTC m=+740.160134723" observedRunningTime="2025-11-26 12:24:12.909742994 +0000 UTC m=+750.816956346" watchObservedRunningTime="2025-11-26 12:24:12.91355941 +0000 UTC m=+750.820772763" Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.945431 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-ppdpl" podStartSLOduration=11.880971068000001 podStartE2EDuration="21.945415784s" podCreationTimestamp="2025-11-26 12:23:51 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.232952503 +0000 UTC m=+730.140165855" 
lastFinishedPulling="2025-11-26 12:24:02.297397218 +0000 UTC m=+740.204610571" observedRunningTime="2025-11-26 12:24:12.944767151 +0000 UTC m=+750.851980503" watchObservedRunningTime="2025-11-26 12:24:12.945415784 +0000 UTC m=+750.852629135" Nov 26 12:24:12 crc kubenswrapper[4834]: I1126 12:24:12.979642 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" podStartSLOduration=17.003767549 podStartE2EDuration="21.979615686s" podCreationTimestamp="2025-11-26 12:23:51 +0000 UTC" firstStartedPulling="2025-11-26 12:24:07.449723255 +0000 UTC m=+745.356936608" lastFinishedPulling="2025-11-26 12:24:12.425571392 +0000 UTC m=+750.332784745" observedRunningTime="2025-11-26 12:24:12.97346511 +0000 UTC m=+750.880678461" watchObservedRunningTime="2025-11-26 12:24:12.979615686 +0000 UTC m=+750.886829038" Nov 26 12:24:13 crc kubenswrapper[4834]: I1126 12:24:13.880432 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" event={"ID":"ddae14f0-dc74-43ea-937c-144dd9ecdb62","Type":"ContainerStarted","Data":"454c9f5af50ef3fab1637ac366e6905dd130857ec3c14ca1c75029d25860f51a"} Nov 26 12:24:13 crc kubenswrapper[4834]: I1126 12:24:13.880715 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" Nov 26 12:24:13 crc kubenswrapper[4834]: I1126 12:24:13.882685 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" event={"ID":"e37121ad-deca-4553-ae91-2a61eb0f9aac","Type":"ContainerStarted","Data":"6164766dd76048ad50eb0a79850066b3b4504777595ef2aba7934d57ee9cabcf"} Nov 26 12:24:13 crc kubenswrapper[4834]: I1126 12:24:13.886135 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" event={"ID":"2b31b20d-186a-4fb2-bfa2-914e5eda233e","Type":"ContainerStarted","Data":"93e2c6af4d12d6de3c1795739570c54be979e05eed91c65b1928cc4a7d62c872"} Nov 26 12:24:13 crc kubenswrapper[4834]: I1126 12:24:13.887032 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" Nov 26 12:24:13 crc kubenswrapper[4834]: I1126 12:24:13.902004 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" podStartSLOduration=2.976631263 podStartE2EDuration="22.90198646s" podCreationTimestamp="2025-11-26 12:23:51 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.501595146 +0000 UTC m=+730.408808498" lastFinishedPulling="2025-11-26 12:24:12.426950343 +0000 UTC m=+750.334163695" observedRunningTime="2025-11-26 12:24:13.897794387 +0000 UTC m=+751.805007739" watchObservedRunningTime="2025-11-26 12:24:13.90198646 +0000 UTC m=+751.809199812" Nov 26 12:24:13 crc kubenswrapper[4834]: I1126 12:24:13.913656 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" podStartSLOduration=3.8531028259999998 podStartE2EDuration="23.91364403s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:23:52.391430766 +0000 UTC m=+730.298644118" lastFinishedPulling="2025-11-26 12:24:12.45197197 +0000 UTC m=+750.359185322" observedRunningTime="2025-11-26 12:24:13.909303216 +0000 UTC m=+751.816516568" watchObservedRunningTime="2025-11-26 12:24:13.91364403 +0000 UTC m=+751.820857383" Nov 26 12:24:17 crc kubenswrapper[4834]: I1126 12:24:17.048620 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-674cb676c8z4lrf" Nov 26 12:24:17 crc 
kubenswrapper[4834]: I1126 12:24:17.467643 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-659d75f7c6-78m9j" Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.430934 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-kcp4x" Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.531123 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.531494 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.531542 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.532245 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a287c3eb83da9750c870469d031aef3c2e315d099f1dc03a585c96cc0c81709e"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.532342 4834 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://a287c3eb83da9750c870469d031aef3c2e315d099f1dc03a585c96cc0c81709e" gracePeriod=600 Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.563718 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xf2w7" Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.620683 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-wz75v" Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.666569 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-7nh8j" Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.767576 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-7s4pw" Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.926820 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="a287c3eb83da9750c870469d031aef3c2e315d099f1dc03a585c96cc0c81709e" exitCode=0 Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.926898 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"a287c3eb83da9750c870469d031aef3c2e315d099f1dc03a585c96cc0c81709e"} Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.927168 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" 
event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"a30f1ec8aaa63fca565600131ca721203e6bad41e9594f352f74db2095a7e3eb"} Nov 26 12:24:21 crc kubenswrapper[4834]: I1126 12:24:21.927192 4834 scope.go:117] "RemoveContainer" containerID="758813397b9c2c2ccad432b9cb11a0273141f48fa83e1aaad525d33be881d337" Nov 26 12:24:22 crc kubenswrapper[4834]: I1126 12:24:22.682767 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:24:22 crc kubenswrapper[4834]: I1126 12:24:22.690436 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f07c17d-8260-47a7-b1e1-0f16226838a7-cert\") pod \"infra-operator-controller-manager-57548d458d-btq96\" (UID: \"7f07c17d-8260-47a7-b1e1-0f16226838a7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:24:22 crc kubenswrapper[4834]: I1126 12:24:22.717666 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:24:23 crc kubenswrapper[4834]: I1126 12:24:23.072593 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-btq96"] Nov 26 12:24:23 crc kubenswrapper[4834]: I1126 12:24:23.940717 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" event={"ID":"7f07c17d-8260-47a7-b1e1-0f16226838a7","Type":"ContainerStarted","Data":"8a3d31d9e677f64cb0b11a315202476b4c6febe42d1e9075d7d13c046d3259d7"} Nov 26 12:24:25 crc kubenswrapper[4834]: I1126 12:24:25.953190 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" event={"ID":"7f07c17d-8260-47a7-b1e1-0f16226838a7","Type":"ContainerStarted","Data":"3da9d3f3fa96874e17ac9fc5ceef5bb081b232c10f927fd1930be69ebd8d10cb"} Nov 26 12:24:25 crc kubenswrapper[4834]: I1126 12:24:25.953694 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:24:25 crc kubenswrapper[4834]: I1126 12:24:25.953707 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" event={"ID":"7f07c17d-8260-47a7-b1e1-0f16226838a7","Type":"ContainerStarted","Data":"9998a8cb51281700ad3c15340f61bf4ce33a955e16c732ce49ff4a5d83ded5de"} Nov 26 12:24:25 crc kubenswrapper[4834]: I1126 12:24:25.968905 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" podStartSLOduration=33.969108895 podStartE2EDuration="35.968891557s" podCreationTimestamp="2025-11-26 12:23:50 +0000 UTC" firstStartedPulling="2025-11-26 12:24:23.084836062 +0000 UTC m=+760.992049414" lastFinishedPulling="2025-11-26 
12:24:25.084618724 +0000 UTC m=+762.991832076" observedRunningTime="2025-11-26 12:24:25.964604644 +0000 UTC m=+763.871817996" watchObservedRunningTime="2025-11-26 12:24:25.968891557 +0000 UTC m=+763.876104898" Nov 26 12:24:32 crc kubenswrapper[4834]: I1126 12:24:32.722762 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-btq96" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.149178 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xjkgv"] Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.152033 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.154374 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.154750 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4gcbn" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.154928 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.155037 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.163484 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xjkgv"] Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.171509 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f5cw\" (UniqueName: \"kubernetes.io/projected/07d60422-877a-4527-8ce0-c4d20cdba117-kube-api-access-4f5cw\") pod \"dnsmasq-dns-7bdd77c89-xjkgv\" (UID: \"07d60422-877a-4527-8ce0-c4d20cdba117\") " 
pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.171548 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d60422-877a-4527-8ce0-c4d20cdba117-config\") pod \"dnsmasq-dns-7bdd77c89-xjkgv\" (UID: \"07d60422-877a-4527-8ce0-c4d20cdba117\") " pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.215456 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6584b49599-twv8t"] Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.216545 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.218980 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.238566 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-twv8t"] Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.272875 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2lh\" (UniqueName: \"kubernetes.io/projected/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-kube-api-access-ql2lh\") pod \"dnsmasq-dns-6584b49599-twv8t\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.272982 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f5cw\" (UniqueName: \"kubernetes.io/projected/07d60422-877a-4527-8ce0-c4d20cdba117-kube-api-access-4f5cw\") pod \"dnsmasq-dns-7bdd77c89-xjkgv\" (UID: \"07d60422-877a-4527-8ce0-c4d20cdba117\") " pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.273009 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d60422-877a-4527-8ce0-c4d20cdba117-config\") pod \"dnsmasq-dns-7bdd77c89-xjkgv\" (UID: \"07d60422-877a-4527-8ce0-c4d20cdba117\") " pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.273064 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-config\") pod \"dnsmasq-dns-6584b49599-twv8t\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.273078 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-dns-svc\") pod \"dnsmasq-dns-6584b49599-twv8t\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.274044 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d60422-877a-4527-8ce0-c4d20cdba117-config\") pod \"dnsmasq-dns-7bdd77c89-xjkgv\" (UID: \"07d60422-877a-4527-8ce0-c4d20cdba117\") " pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.296475 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f5cw\" (UniqueName: \"kubernetes.io/projected/07d60422-877a-4527-8ce0-c4d20cdba117-kube-api-access-4f5cw\") pod \"dnsmasq-dns-7bdd77c89-xjkgv\" (UID: \"07d60422-877a-4527-8ce0-c4d20cdba117\") " pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.375751 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-config\") pod \"dnsmasq-dns-6584b49599-twv8t\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.375788 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-dns-svc\") pod \"dnsmasq-dns-6584b49599-twv8t\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.375844 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql2lh\" (UniqueName: \"kubernetes.io/projected/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-kube-api-access-ql2lh\") pod \"dnsmasq-dns-6584b49599-twv8t\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.376563 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-config\") pod \"dnsmasq-dns-6584b49599-twv8t\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.376879 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-dns-svc\") pod \"dnsmasq-dns-6584b49599-twv8t\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.390520 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql2lh\" (UniqueName: \"kubernetes.io/projected/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-kube-api-access-ql2lh\") pod 
\"dnsmasq-dns-6584b49599-twv8t\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.465796 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.544772 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.835070 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xjkgv"] Nov 26 12:24:45 crc kubenswrapper[4834]: W1126 12:24:45.838812 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07d60422_877a_4527_8ce0_c4d20cdba117.slice/crio-e00e5b123ff1c9fd70278d2ee8a9b504c70883a750fbc2c2e661ad258e1b396c WatchSource:0}: Error finding container e00e5b123ff1c9fd70278d2ee8a9b504c70883a750fbc2c2e661ad258e1b396c: Status 404 returned error can't find the container with id e00e5b123ff1c9fd70278d2ee8a9b504c70883a750fbc2c2e661ad258e1b396c Nov 26 12:24:45 crc kubenswrapper[4834]: I1126 12:24:45.933706 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-twv8t"] Nov 26 12:24:45 crc kubenswrapper[4834]: W1126 12:24:45.936991 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebfdfa9a_28df_482e_a947_1972ebe7e7d8.slice/crio-7684c3ca2926f7c7d561c17a180ecc1040137360446b9416c95d9c5a94bc7de0 WatchSource:0}: Error finding container 7684c3ca2926f7c7d561c17a180ecc1040137360446b9416c95d9c5a94bc7de0: Status 404 returned error can't find the container with id 7684c3ca2926f7c7d561c17a180ecc1040137360446b9416c95d9c5a94bc7de0 Nov 26 12:24:46 crc kubenswrapper[4834]: I1126 12:24:46.064952 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6584b49599-twv8t" event={"ID":"ebfdfa9a-28df-482e-a947-1972ebe7e7d8","Type":"ContainerStarted","Data":"7684c3ca2926f7c7d561c17a180ecc1040137360446b9416c95d9c5a94bc7de0"} Nov 26 12:24:46 crc kubenswrapper[4834]: I1126 12:24:46.065928 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" event={"ID":"07d60422-877a-4527-8ce0-c4d20cdba117","Type":"ContainerStarted","Data":"e00e5b123ff1c9fd70278d2ee8a9b504c70883a750fbc2c2e661ad258e1b396c"} Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.596018 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-twv8t"] Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.615278 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-pgp2x"] Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.616828 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.621802 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-pgp2x"] Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.625861 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-dns-svc\") pod \"dnsmasq-dns-7c6d9948dc-pgp2x\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.625928 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-config\") pod \"dnsmasq-dns-7c6d9948dc-pgp2x\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 
12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.626031 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss222\" (UniqueName: \"kubernetes.io/projected/649f4660-441e-45a2-bdcd-ed292bf1d153-kube-api-access-ss222\") pod \"dnsmasq-dns-7c6d9948dc-pgp2x\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.727646 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss222\" (UniqueName: \"kubernetes.io/projected/649f4660-441e-45a2-bdcd-ed292bf1d153-kube-api-access-ss222\") pod \"dnsmasq-dns-7c6d9948dc-pgp2x\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.727758 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-dns-svc\") pod \"dnsmasq-dns-7c6d9948dc-pgp2x\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.727814 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-config\") pod \"dnsmasq-dns-7c6d9948dc-pgp2x\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.728711 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-config\") pod \"dnsmasq-dns-7c6d9948dc-pgp2x\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.729526 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-dns-svc\") pod \"dnsmasq-dns-7c6d9948dc-pgp2x\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.750102 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss222\" (UniqueName: \"kubernetes.io/projected/649f4660-441e-45a2-bdcd-ed292bf1d153-kube-api-access-ss222\") pod \"dnsmasq-dns-7c6d9948dc-pgp2x\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.867940 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xjkgv"] Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.895359 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-cn5pw"] Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.896590 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.902851 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-cn5pw"] Nov 26 12:24:48 crc kubenswrapper[4834]: I1126 12:24:48.944674 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.031731 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-config\") pod \"dnsmasq-dns-6486446b9f-cn5pw\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.032113 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-dns-svc\") pod \"dnsmasq-dns-6486446b9f-cn5pw\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.032188 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7tww\" (UniqueName: \"kubernetes.io/projected/f2e8472a-0359-4818-b827-e901407fdcdf-kube-api-access-z7tww\") pod \"dnsmasq-dns-6486446b9f-cn5pw\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.133496 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-dns-svc\") pod \"dnsmasq-dns-6486446b9f-cn5pw\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.133593 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7tww\" (UniqueName: \"kubernetes.io/projected/f2e8472a-0359-4818-b827-e901407fdcdf-kube-api-access-z7tww\") pod \"dnsmasq-dns-6486446b9f-cn5pw\" (UID: 
\"f2e8472a-0359-4818-b827-e901407fdcdf\") " pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.133752 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-config\") pod \"dnsmasq-dns-6486446b9f-cn5pw\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.134523 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-config\") pod \"dnsmasq-dns-6486446b9f-cn5pw\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.134524 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-dns-svc\") pod \"dnsmasq-dns-6486446b9f-cn5pw\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.156973 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7tww\" (UniqueName: \"kubernetes.io/projected/f2e8472a-0359-4818-b827-e901407fdcdf-kube-api-access-z7tww\") pod \"dnsmasq-dns-6486446b9f-cn5pw\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.216013 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.747819 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.748846 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.751285 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bnpxq" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.751362 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.751376 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.751794 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.752215 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.752279 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.752601 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.757991 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947525 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947640 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947681 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947704 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f30c5fe-7895-474e-a94d-967b23650025-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947729 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947767 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-config-data\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 
12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947803 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947821 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947866 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srwwd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-kube-api-access-srwwd\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947897 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:49 crc kubenswrapper[4834]: I1126 12:24:49.947927 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f30c5fe-7895-474e-a94d-967b23650025-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc 
kubenswrapper[4834]: I1126 12:24:50.017893 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.019422 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.023852 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.023892 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-w6st6" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.024097 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.024172 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.024333 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.027243 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.027657 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.032171 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049461 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f30c5fe-7895-474e-a94d-967b23650025-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049497 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049531 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-config-data\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049562 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049579 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049613 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srwwd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-kube-api-access-srwwd\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049642 
4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049668 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f30c5fe-7895-474e-a94d-967b23650025-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049684 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049738 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.049763 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.051981 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-plugins\") 
pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.052170 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.053157 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.053992 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.054761 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f30c5fe-7895-474e-a94d-967b23650025-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.056385 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-config-data\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 
12:24:50.056974 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.057583 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.070194 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.070273 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srwwd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-kube-api-access-srwwd\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.075323 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f30c5fe-7895-474e-a94d-967b23650025-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.087569 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" 
(UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " pod="openstack/rabbitmq-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.151150 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/25058ace-b8d8-4ac1-924a-844f3955f0fb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.151236 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.151266 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25058ace-b8d8-4ac1-924a-844f3955f0fb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.151472 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrvpl\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-kube-api-access-qrvpl\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.151621 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.151888 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.152011 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.152068 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.152119 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.152155 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.152207 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.254092 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.254389 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25058ace-b8d8-4ac1-924a-844f3955f0fb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.254525 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrvpl\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-kube-api-access-qrvpl\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.254660 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.254760 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.254875 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.254929 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.255009 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.255077 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.255149 
4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.255225 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.255321 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/25058ace-b8d8-4ac1-924a-844f3955f0fb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.255506 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.255820 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.256176 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.256295 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.257209 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.257746 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25058ace-b8d8-4ac1-924a-844f3955f0fb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.259091 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.260441 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/25058ace-b8d8-4ac1-924a-844f3955f0fb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.261261 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.274635 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrvpl\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-kube-api-access-qrvpl\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.279780 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.340389 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:24:50 crc kubenswrapper[4834]: I1126 12:24:50.377527 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.459410 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.460956 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.465455 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.465549 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.465843 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-fvhnv" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.466216 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.470022 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.473130 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.576396 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b694cd-3381-4f15-8d74-8cfc72753ae3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.576466 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.576507 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e6b694cd-3381-4f15-8d74-8cfc72753ae3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.576526 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b694cd-3381-4f15-8d74-8cfc72753ae3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.576565 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e6b694cd-3381-4f15-8d74-8cfc72753ae3-kolla-config\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.576584 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfkxp\" (UniqueName: \"kubernetes.io/projected/e6b694cd-3381-4f15-8d74-8cfc72753ae3-kube-api-access-pfkxp\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.576616 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e6b694cd-3381-4f15-8d74-8cfc72753ae3-config-data-default\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.576631 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/e6b694cd-3381-4f15-8d74-8cfc72753ae3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.678068 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b694cd-3381-4f15-8d74-8cfc72753ae3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.679233 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.678286 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.680787 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b694cd-3381-4f15-8d74-8cfc72753ae3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.680831 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b694cd-3381-4f15-8d74-8cfc72753ae3-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.680886 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e6b694cd-3381-4f15-8d74-8cfc72753ae3-kolla-config\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.680905 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfkxp\" (UniqueName: \"kubernetes.io/projected/e6b694cd-3381-4f15-8d74-8cfc72753ae3-kube-api-access-pfkxp\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.680952 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e6b694cd-3381-4f15-8d74-8cfc72753ae3-config-data-default\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.680983 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e6b694cd-3381-4f15-8d74-8cfc72753ae3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.689817 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b694cd-3381-4f15-8d74-8cfc72753ae3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: 
I1126 12:24:51.690071 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e6b694cd-3381-4f15-8d74-8cfc72753ae3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.690645 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e6b694cd-3381-4f15-8d74-8cfc72753ae3-kolla-config\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.691776 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b694cd-3381-4f15-8d74-8cfc72753ae3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.692165 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e6b694cd-3381-4f15-8d74-8cfc72753ae3-config-data-default\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.701580 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.701961 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e6b694cd-3381-4f15-8d74-8cfc72753ae3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.702931 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfkxp\" (UniqueName: \"kubernetes.io/projected/e6b694cd-3381-4f15-8d74-8cfc72753ae3-kube-api-access-pfkxp\") pod \"openstack-galera-0\" (UID: \"e6b694cd-3381-4f15-8d74-8cfc72753ae3\") " pod="openstack/openstack-galera-0" Nov 26 12:24:51 crc kubenswrapper[4834]: I1126 12:24:51.780496 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.766724 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.768239 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.771581 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2ltqg" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.772153 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.772340 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.772487 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.775474 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.903383 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c5ced50-7529-4c2a-822b-0a10cf6a9700-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.903440 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5ced50-7529-4c2a-822b-0a10cf6a9700-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.903512 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/4c5ced50-7529-4c2a-822b-0a10cf6a9700-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.903538 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c5ced50-7529-4c2a-822b-0a10cf6a9700-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.903566 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5ced50-7529-4c2a-822b-0a10cf6a9700-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.903664 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6p5q\" (UniqueName: \"kubernetes.io/projected/4c5ced50-7529-4c2a-822b-0a10cf6a9700-kube-api-access-r6p5q\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.903697 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5ced50-7529-4c2a-822b-0a10cf6a9700-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:52 crc kubenswrapper[4834]: I1126 12:24:52.903878 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.004829 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.004873 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c5ced50-7529-4c2a-822b-0a10cf6a9700-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.004889 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5ced50-7529-4c2a-822b-0a10cf6a9700-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.004915 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c5ced50-7529-4c2a-822b-0a10cf6a9700-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.004933 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/4c5ced50-7529-4c2a-822b-0a10cf6a9700-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.004954 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5ced50-7529-4c2a-822b-0a10cf6a9700-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.004984 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6p5q\" (UniqueName: \"kubernetes.io/projected/4c5ced50-7529-4c2a-822b-0a10cf6a9700-kube-api-access-r6p5q\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.005003 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5ced50-7529-4c2a-822b-0a10cf6a9700-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.005974 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c5ced50-7529-4c2a-822b-0a10cf6a9700-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.006007 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.006360 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c5ced50-7529-4c2a-822b-0a10cf6a9700-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.006633 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c5ced50-7529-4c2a-822b-0a10cf6a9700-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.011198 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5ced50-7529-4c2a-822b-0a10cf6a9700-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.011485 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5ced50-7529-4c2a-822b-0a10cf6a9700-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.020976 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5ced50-7529-4c2a-822b-0a10cf6a9700-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") 
" pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.026800 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6p5q\" (UniqueName: \"kubernetes.io/projected/4c5ced50-7529-4c2a-822b-0a10cf6a9700-kube-api-access-r6p5q\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.042388 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4c5ced50-7529-4c2a-822b-0a10cf6a9700\") " pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.087066 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.320041 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.321175 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.324025 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.324113 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vmdjw" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.326572 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.354456 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.411290 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4qb6\" (UniqueName: \"kubernetes.io/projected/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-kube-api-access-j4qb6\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.411423 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-memcached-tls-certs\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.411473 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-kolla-config\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.411643 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-config-data\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.411706 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-combined-ca-bundle\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.513039 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4qb6\" (UniqueName: \"kubernetes.io/projected/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-kube-api-access-j4qb6\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.513082 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-memcached-tls-certs\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.513119 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-kolla-config\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.513188 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-config-data\") pod \"memcached-0\" (UID: 
\"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.513224 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-combined-ca-bundle\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.514431 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-kolla-config\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.514833 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-config-data\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.518442 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-combined-ca-bundle\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.523984 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-memcached-tls-certs\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.526812 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4qb6\" (UniqueName: 
\"kubernetes.io/projected/614cfce6-4cb6-46ed-9012-a2ff7faf0a64-kube-api-access-j4qb6\") pod \"memcached-0\" (UID: \"614cfce6-4cb6-46ed-9012-a2ff7faf0a64\") " pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.653146 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 26 12:24:53 crc kubenswrapper[4834]: I1126 12:24:53.688842 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-cn5pw"] Nov 26 12:24:55 crc kubenswrapper[4834]: I1126 12:24:55.254292 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 12:24:55 crc kubenswrapper[4834]: I1126 12:24:55.255501 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 12:24:55 crc kubenswrapper[4834]: I1126 12:24:55.262432 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-k4md8" Nov 26 12:24:55 crc kubenswrapper[4834]: I1126 12:24:55.262989 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 12:24:55 crc kubenswrapper[4834]: I1126 12:24:55.350516 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jj7p\" (UniqueName: \"kubernetes.io/projected/fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8-kube-api-access-2jj7p\") pod \"kube-state-metrics-0\" (UID: \"fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8\") " pod="openstack/kube-state-metrics-0" Nov 26 12:24:55 crc kubenswrapper[4834]: I1126 12:24:55.452175 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jj7p\" (UniqueName: \"kubernetes.io/projected/fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8-kube-api-access-2jj7p\") pod \"kube-state-metrics-0\" (UID: \"fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8\") " pod="openstack/kube-state-metrics-0" Nov 26 12:24:55 crc 
kubenswrapper[4834]: I1126 12:24:55.473594 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jj7p\" (UniqueName: \"kubernetes.io/projected/fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8-kube-api-access-2jj7p\") pod \"kube-state-metrics-0\" (UID: \"fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8\") " pod="openstack/kube-state-metrics-0" Nov 26 12:24:55 crc kubenswrapper[4834]: I1126 12:24:55.599836 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 12:24:57 crc kubenswrapper[4834]: I1126 12:24:57.648041 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-pgp2x"] Nov 26 12:24:57 crc kubenswrapper[4834]: I1126 12:24:57.706480 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 12:24:57 crc kubenswrapper[4834]: W1126 12:24:57.961343 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f30c5fe_7895_474e_a94d_967b23650025.slice/crio-34c7b5e3955204d18fbe411bec268d57d14a4e8b27a972381f33f454e07e3d24 WatchSource:0}: Error finding container 34c7b5e3955204d18fbe411bec268d57d14a4e8b27a972381f33f454e07e3d24: Status 404 returned error can't find the container with id 34c7b5e3955204d18fbe411bec268d57d14a4e8b27a972381f33f454e07e3d24 Nov 26 12:24:57 crc kubenswrapper[4834]: W1126 12:24:57.962003 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod649f4660_441e_45a2_bdcd_ed292bf1d153.slice/crio-1d6eb89cc09edba0c4e2b645f9b881ae1a5817c5ab874e813d521707991d6933 WatchSource:0}: Error finding container 1d6eb89cc09edba0c4e2b645f9b881ae1a5817c5ab874e813d521707991d6933: Status 404 returned error can't find the container with id 1d6eb89cc09edba0c4e2b645f9b881ae1a5817c5ab874e813d521707991d6933 Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.161480 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" event={"ID":"649f4660-441e-45a2-bdcd-ed292bf1d153","Type":"ContainerStarted","Data":"1d6eb89cc09edba0c4e2b645f9b881ae1a5817c5ab874e813d521707991d6933"} Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.203051 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" event={"ID":"f2e8472a-0359-4818-b827-e901407fdcdf","Type":"ContainerStarted","Data":"0600faf6804ee7519833806a49e29073627ec192bebaa8f3c2d3eed6ded0dc18"} Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.218850 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f30c5fe-7895-474e-a94d-967b23650025","Type":"ContainerStarted","Data":"34c7b5e3955204d18fbe411bec268d57d14a4e8b27a972381f33f454e07e3d24"} Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.376952 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 12:24:58 crc kubenswrapper[4834]: W1126 12:24:58.379306 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25058ace_b8d8_4ac1_924a_844f3955f0fb.slice/crio-828a01a0e55a04224a4fad80a60f718afb6028c4968d95acd3b1f055da0f4e9d WatchSource:0}: Error finding container 828a01a0e55a04224a4fad80a60f718afb6028c4968d95acd3b1f055da0f4e9d: Status 404 returned error can't find the container with id 828a01a0e55a04224a4fad80a60f718afb6028c4968d95acd3b1f055da0f4e9d Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.470622 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.480685 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 12:24:58 crc kubenswrapper[4834]: W1126 12:24:58.531049 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod614cfce6_4cb6_46ed_9012_a2ff7faf0a64.slice/crio-b68a472286a7c8aec8784bb2b8dede1a6084d6bc41449ac31e4e9d8101d2695e WatchSource:0}: Error finding container b68a472286a7c8aec8784bb2b8dede1a6084d6bc41449ac31e4e9d8101d2695e: Status 404 returned error can't find the container with id b68a472286a7c8aec8784bb2b8dede1a6084d6bc41449ac31e4e9d8101d2695e Nov 26 12:24:58 crc kubenswrapper[4834]: W1126 12:24:58.531543 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b694cd_3381_4f15_8d74_8cfc72753ae3.slice/crio-cb8d6311dbcefa9d263b4b9efab524af5b4d46f89e50eda05e6d7bfd6ec4ca7e WatchSource:0}: Error finding container cb8d6311dbcefa9d263b4b9efab524af5b4d46f89e50eda05e6d7bfd6ec4ca7e: Status 404 returned error can't find the container with id cb8d6311dbcefa9d263b4b9efab524af5b4d46f89e50eda05e6d7bfd6ec4ca7e Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.592334 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.607548 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-n6tm2"] Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.608891 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.611844 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-msjjr" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.612400 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.613563 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.624074 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-n6tm2"] Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.639293 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.665735 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7b48f"] Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.680426 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.681637 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7b48f"] Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.714473 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5p2g\" (UniqueName: \"kubernetes.io/projected/b0ec582c-ead4-4350-ac7f-530f80804717-kube-api-access-j5p2g\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.714552 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ec582c-ead4-4350-ac7f-530f80804717-var-run-ovn\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.714653 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0ec582c-ead4-4350-ac7f-530f80804717-var-run\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.714690 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ec582c-ead4-4350-ac7f-530f80804717-combined-ca-bundle\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.714798 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b0ec582c-ead4-4350-ac7f-530f80804717-ovn-controller-tls-certs\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.714874 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0ec582c-ead4-4350-ac7f-530f80804717-scripts\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.714920 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ec582c-ead4-4350-ac7f-530f80804717-var-log-ovn\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.817473 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ec582c-ead4-4350-ac7f-530f80804717-var-run-ovn\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.817554 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-etc-ovs\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.817612 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc1e8363-c0e7-428e-b215-5d246d6c5094-scripts\") pod 
\"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.817645 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k5tb\" (UniqueName: \"kubernetes.io/projected/dc1e8363-c0e7-428e-b215-5d246d6c5094-kube-api-access-6k5tb\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.817782 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-var-log\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.817858 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0ec582c-ead4-4350-ac7f-530f80804717-var-run\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.817909 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ec582c-ead4-4350-ac7f-530f80804717-combined-ca-bundle\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.817970 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ec582c-ead4-4350-ac7f-530f80804717-ovn-controller-tls-certs\") pod \"ovn-controller-n6tm2\" (UID: 
\"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.818037 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0ec582c-ead4-4350-ac7f-530f80804717-scripts\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.818062 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-var-lib\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.818099 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-var-run\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.818125 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ec582c-ead4-4350-ac7f-530f80804717-var-log-ovn\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.818113 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ec582c-ead4-4350-ac7f-530f80804717-var-run-ovn\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 
12:24:58.818438 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5p2g\" (UniqueName: \"kubernetes.io/projected/b0ec582c-ead4-4350-ac7f-530f80804717-kube-api-access-j5p2g\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.818543 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ec582c-ead4-4350-ac7f-530f80804717-var-log-ovn\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.818789 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0ec582c-ead4-4350-ac7f-530f80804717-var-run\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.820465 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0ec582c-ead4-4350-ac7f-530f80804717-scripts\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.824124 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ec582c-ead4-4350-ac7f-530f80804717-ovn-controller-tls-certs\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.824133 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0ec582c-ead4-4350-ac7f-530f80804717-combined-ca-bundle\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.834829 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5p2g\" (UniqueName: \"kubernetes.io/projected/b0ec582c-ead4-4350-ac7f-530f80804717-kube-api-access-j5p2g\") pod \"ovn-controller-n6tm2\" (UID: \"b0ec582c-ead4-4350-ac7f-530f80804717\") " pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.920454 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-etc-ovs\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.920528 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc1e8363-c0e7-428e-b215-5d246d6c5094-scripts\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.920549 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k5tb\" (UniqueName: \"kubernetes.io/projected/dc1e8363-c0e7-428e-b215-5d246d6c5094-kube-api-access-6k5tb\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.920584 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-var-log\") pod \"ovn-controller-ovs-7b48f\" (UID: 
\"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.920694 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-var-lib\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.920722 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-var-run\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.920943 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-var-run\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.921105 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-etc-ovs\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.922916 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-var-log\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.922965 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dc1e8363-c0e7-428e-b215-5d246d6c5094-var-lib\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.923613 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc1e8363-c0e7-428e-b215-5d246d6c5094-scripts\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.931561 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-n6tm2" Nov 26 12:24:58 crc kubenswrapper[4834]: I1126 12:24:58.937384 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k5tb\" (UniqueName: \"kubernetes.io/projected/dc1e8363-c0e7-428e-b215-5d246d6c5094-kube-api-access-6k5tb\") pod \"ovn-controller-ovs-7b48f\" (UID: \"dc1e8363-c0e7-428e-b215-5d246d6c5094\") " pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.011956 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.229753 4834 generic.go:334] "Generic (PLEG): container finished" podID="649f4660-441e-45a2-bdcd-ed292bf1d153" containerID="74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211" exitCode=0 Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.229866 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" event={"ID":"649f4660-441e-45a2-bdcd-ed292bf1d153","Type":"ContainerDied","Data":"74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211"} Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.232557 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"614cfce6-4cb6-46ed-9012-a2ff7faf0a64","Type":"ContainerStarted","Data":"b68a472286a7c8aec8784bb2b8dede1a6084d6bc41449ac31e4e9d8101d2695e"} Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.234497 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e6b694cd-3381-4f15-8d74-8cfc72753ae3","Type":"ContainerStarted","Data":"cb8d6311dbcefa9d263b4b9efab524af5b4d46f89e50eda05e6d7bfd6ec4ca7e"} Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.236262 4834 generic.go:334] "Generic (PLEG): container finished" podID="f2e8472a-0359-4818-b827-e901407fdcdf" containerID="fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250" exitCode=0 Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.236343 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" event={"ID":"f2e8472a-0359-4818-b827-e901407fdcdf","Type":"ContainerDied","Data":"fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250"} Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.238563 4834 generic.go:334] "Generic (PLEG): container finished" podID="ebfdfa9a-28df-482e-a947-1972ebe7e7d8" 
containerID="c826f85f3903d6a089737ad54a6333df3905122d5665618f74f9353d8ea03ffa" exitCode=0 Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.238620 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6584b49599-twv8t" event={"ID":"ebfdfa9a-28df-482e-a947-1972ebe7e7d8","Type":"ContainerDied","Data":"c826f85f3903d6a089737ad54a6333df3905122d5665618f74f9353d8ea03ffa"} Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.239807 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4c5ced50-7529-4c2a-822b-0a10cf6a9700","Type":"ContainerStarted","Data":"227e621ea0c03f524bcd68f71bb3cb84f3f797f2ea06df08387df24710af7e4a"} Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.241047 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8","Type":"ContainerStarted","Data":"98e7852bca5a3091f41a9904fb88501161dcc52d7741319c20ab81893865e18c"} Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.242472 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"25058ace-b8d8-4ac1-924a-844f3955f0fb","Type":"ContainerStarted","Data":"828a01a0e55a04224a4fad80a60f718afb6028c4968d95acd3b1f055da0f4e9d"} Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.246033 4834 generic.go:334] "Generic (PLEG): container finished" podID="07d60422-877a-4527-8ce0-c4d20cdba117" containerID="c6e0b0dead758db7badabd6718ca875348aa841c68278afcc5b0a396e401eae5" exitCode=0 Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.246110 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" event={"ID":"07d60422-877a-4527-8ce0-c4d20cdba117","Type":"ContainerDied","Data":"c6e0b0dead758db7badabd6718ca875348aa841c68278afcc5b0a396e401eae5"} Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.351734 4834 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ovn-controller-n6tm2"] Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.587822 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7b48f"] Nov 26 12:24:59 crc kubenswrapper[4834]: W1126 12:24:59.726766 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc1e8363_c0e7_428e_b215_5d246d6c5094.slice/crio-34bbe63e04a5aecf657b50abc00ffcec2ad3dbec5f1a7803db0d76df118510f5 WatchSource:0}: Error finding container 34bbe63e04a5aecf657b50abc00ffcec2ad3dbec5f1a7803db0d76df118510f5: Status 404 returned error can't find the container with id 34bbe63e04a5aecf657b50abc00ffcec2ad3dbec5f1a7803db0d76df118510f5 Nov 26 12:24:59 crc kubenswrapper[4834]: W1126 12:24:59.729453 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0ec582c_ead4_4350_ac7f_530f80804717.slice/crio-42ff345a5caef7710bb14774dc356a9354d772e32257d646c63ae5180fd5789c WatchSource:0}: Error finding container 42ff345a5caef7710bb14774dc356a9354d772e32257d646c63ae5180fd5789c: Status 404 returned error can't find the container with id 42ff345a5caef7710bb14774dc356a9354d772e32257d646c63ae5180fd5789c Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.801367 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.816297 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.882701 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-n9hx4"] Nov 26 12:24:59 crc kubenswrapper[4834]: E1126 12:24:59.883016 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfdfa9a-28df-482e-a947-1972ebe7e7d8" containerName="init" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.883029 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfdfa9a-28df-482e-a947-1972ebe7e7d8" containerName="init" Nov 26 12:24:59 crc kubenswrapper[4834]: E1126 12:24:59.883051 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d60422-877a-4527-8ce0-c4d20cdba117" containerName="init" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.883056 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d60422-877a-4527-8ce0-c4d20cdba117" containerName="init" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.883213 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d60422-877a-4527-8ce0-c4d20cdba117" containerName="init" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.883233 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebfdfa9a-28df-482e-a947-1972ebe7e7d8" containerName="init" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.883763 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.885544 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.885856 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.893284 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n9hx4"] Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.946739 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-config\") pod \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.946813 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql2lh\" (UniqueName: \"kubernetes.io/projected/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-kube-api-access-ql2lh\") pod \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.946836 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f5cw\" (UniqueName: \"kubernetes.io/projected/07d60422-877a-4527-8ce0-c4d20cdba117-kube-api-access-4f5cw\") pod \"07d60422-877a-4527-8ce0-c4d20cdba117\" (UID: \"07d60422-877a-4527-8ce0-c4d20cdba117\") " Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.946920 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-dns-svc\") pod \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\" (UID: \"ebfdfa9a-28df-482e-a947-1972ebe7e7d8\") " Nov 26 
12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.946956 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d60422-877a-4527-8ce0-c4d20cdba117-config\") pod \"07d60422-877a-4527-8ce0-c4d20cdba117\" (UID: \"07d60422-877a-4527-8ce0-c4d20cdba117\") " Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.951051 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-kube-api-access-ql2lh" (OuterVolumeSpecName: "kube-api-access-ql2lh") pod "ebfdfa9a-28df-482e-a947-1972ebe7e7d8" (UID: "ebfdfa9a-28df-482e-a947-1972ebe7e7d8"). InnerVolumeSpecName "kube-api-access-ql2lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.952193 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d60422-877a-4527-8ce0-c4d20cdba117-kube-api-access-4f5cw" (OuterVolumeSpecName: "kube-api-access-4f5cw") pod "07d60422-877a-4527-8ce0-c4d20cdba117" (UID: "07d60422-877a-4527-8ce0-c4d20cdba117"). InnerVolumeSpecName "kube-api-access-4f5cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.964859 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ebfdfa9a-28df-482e-a947-1972ebe7e7d8" (UID: "ebfdfa9a-28df-482e-a947-1972ebe7e7d8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.966479 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-config" (OuterVolumeSpecName: "config") pod "ebfdfa9a-28df-482e-a947-1972ebe7e7d8" (UID: "ebfdfa9a-28df-482e-a947-1972ebe7e7d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:24:59 crc kubenswrapper[4834]: I1126 12:24:59.969837 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d60422-877a-4527-8ce0-c4d20cdba117-config" (OuterVolumeSpecName: "config") pod "07d60422-877a-4527-8ce0-c4d20cdba117" (UID: "07d60422-877a-4527-8ce0-c4d20cdba117"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.048642 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2217b645-a751-42b8-be14-6587b294bf48-combined-ca-bundle\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.048691 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2217b645-a751-42b8-be14-6587b294bf48-config\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.048728 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxv9j\" (UniqueName: \"kubernetes.io/projected/2217b645-a751-42b8-be14-6587b294bf48-kube-api-access-fxv9j\") pod 
\"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.048892 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2217b645-a751-42b8-be14-6587b294bf48-ovn-rundir\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.048978 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2217b645-a751-42b8-be14-6587b294bf48-ovs-rundir\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.049118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2217b645-a751-42b8-be14-6587b294bf48-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.049254 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.049274 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d60422-877a-4527-8ce0-c4d20cdba117-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.049283 4834 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.049293 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql2lh\" (UniqueName: \"kubernetes.io/projected/ebfdfa9a-28df-482e-a947-1972ebe7e7d8-kube-api-access-ql2lh\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.049304 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f5cw\" (UniqueName: \"kubernetes.io/projected/07d60422-877a-4527-8ce0-c4d20cdba117-kube-api-access-4f5cw\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.151041 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2217b645-a751-42b8-be14-6587b294bf48-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.151344 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2217b645-a751-42b8-be14-6587b294bf48-combined-ca-bundle\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.151402 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2217b645-a751-42b8-be14-6587b294bf48-config\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.151439 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fxv9j\" (UniqueName: \"kubernetes.io/projected/2217b645-a751-42b8-be14-6587b294bf48-kube-api-access-fxv9j\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.151482 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2217b645-a751-42b8-be14-6587b294bf48-ovn-rundir\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.151513 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2217b645-a751-42b8-be14-6587b294bf48-ovs-rundir\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.151920 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2217b645-a751-42b8-be14-6587b294bf48-ovs-rundir\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.151896 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2217b645-a751-42b8-be14-6587b294bf48-ovn-rundir\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.152212 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2217b645-a751-42b8-be14-6587b294bf48-config\") 
pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.155169 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2217b645-a751-42b8-be14-6587b294bf48-combined-ca-bundle\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.156274 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2217b645-a751-42b8-be14-6587b294bf48-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.169882 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxv9j\" (UniqueName: \"kubernetes.io/projected/2217b645-a751-42b8-be14-6587b294bf48-kube-api-access-fxv9j\") pod \"ovn-controller-metrics-n9hx4\" (UID: \"2217b645-a751-42b8-be14-6587b294bf48\") " pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.202677 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-n9hx4" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.260202 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" event={"ID":"07d60422-877a-4527-8ce0-c4d20cdba117","Type":"ContainerDied","Data":"e00e5b123ff1c9fd70278d2ee8a9b504c70883a750fbc2c2e661ad258e1b396c"} Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.260597 4834 scope.go:117] "RemoveContainer" containerID="c6e0b0dead758db7badabd6718ca875348aa841c68278afcc5b0a396e401eae5" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.260703 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-xjkgv" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.264897 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-n6tm2" event={"ID":"b0ec582c-ead4-4350-ac7f-530f80804717","Type":"ContainerStarted","Data":"42ff345a5caef7710bb14774dc356a9354d772e32257d646c63ae5180fd5789c"} Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.269465 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6584b49599-twv8t" event={"ID":"ebfdfa9a-28df-482e-a947-1972ebe7e7d8","Type":"ContainerDied","Data":"7684c3ca2926f7c7d561c17a180ecc1040137360446b9416c95d9c5a94bc7de0"} Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.269532 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-twv8t" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.272741 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7b48f" event={"ID":"dc1e8363-c0e7-428e-b215-5d246d6c5094","Type":"ContainerStarted","Data":"34bbe63e04a5aecf657b50abc00ffcec2ad3dbec5f1a7803db0d76df118510f5"} Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.336523 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-twv8t"] Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.343630 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-twv8t"] Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.351529 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xjkgv"] Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.355065 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xjkgv"] Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.428162 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d60422-877a-4527-8ce0-c4d20cdba117" path="/var/lib/kubelet/pods/07d60422-877a-4527-8ce0-c4d20cdba117/volumes" Nov 26 12:25:00 crc kubenswrapper[4834]: I1126 12:25:00.428746 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfdfa9a-28df-482e-a947-1972ebe7e7d8" path="/var/lib/kubelet/pods/ebfdfa9a-28df-482e-a947-1972ebe7e7d8/volumes" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.107750 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.125188 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.127646 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.127936 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.128083 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-n6gzb" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.128221 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.133754 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.295488 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a511537-2e50-4f68-9c68-dcb20e489cb3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.295533 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq4qp\" (UniqueName: \"kubernetes.io/projected/6a511537-2e50-4f68-9c68-dcb20e489cb3-kube-api-access-bq4qp\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.295572 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a511537-2e50-4f68-9c68-dcb20e489cb3-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.295602 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.295641 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a511537-2e50-4f68-9c68-dcb20e489cb3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.295808 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a511537-2e50-4f68-9c68-dcb20e489cb3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.295920 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a511537-2e50-4f68-9c68-dcb20e489cb3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.295979 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a511537-2e50-4f68-9c68-dcb20e489cb3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 
12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.311540 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.316941 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.318646 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-kzxzq" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.318759 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.318860 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.319090 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.324623 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.397733 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a511537-2e50-4f68-9c68-dcb20e489cb3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.397793 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a511537-2e50-4f68-9c68-dcb20e489cb3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.397888 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a511537-2e50-4f68-9c68-dcb20e489cb3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.397913 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq4qp\" (UniqueName: \"kubernetes.io/projected/6a511537-2e50-4f68-9c68-dcb20e489cb3-kube-api-access-bq4qp\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.397962 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a511537-2e50-4f68-9c68-dcb20e489cb3-config\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.398017 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.398073 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a511537-2e50-4f68-9c68-dcb20e489cb3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.398132 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a511537-2e50-4f68-9c68-dcb20e489cb3-ovsdbserver-nb-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.398155 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a511537-2e50-4f68-9c68-dcb20e489cb3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.398570 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.399230 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a511537-2e50-4f68-9c68-dcb20e489cb3-config\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.400489 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a511537-2e50-4f68-9c68-dcb20e489cb3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.406013 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a511537-2e50-4f68-9c68-dcb20e489cb3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.406922 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a511537-2e50-4f68-9c68-dcb20e489cb3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.410234 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a511537-2e50-4f68-9c68-dcb20e489cb3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.414593 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq4qp\" (UniqueName: \"kubernetes.io/projected/6a511537-2e50-4f68-9c68-dcb20e489cb3-kube-api-access-bq4qp\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.416010 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6a511537-2e50-4f68-9c68-dcb20e489cb3\") " pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.451810 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.499874 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.500548 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.500624 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.500649 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.500695 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5k77\" (UniqueName: \"kubernetes.io/projected/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-kube-api-access-w5k77\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 
12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.500715 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-config\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.500768 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.500789 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.602497 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5k77\" (UniqueName: \"kubernetes.io/projected/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-kube-api-access-w5k77\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.602561 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-config\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.602595 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.602623 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.602690 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.602786 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.602854 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.602883 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.604211 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.604460 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-config\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.605365 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.606104 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.607420 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.607930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.616971 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.619538 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5k77\" (UniqueName: \"kubernetes.io/projected/2b8fa876-0579-4ff1-be1c-4a45969fa4ae-kube-api-access-w5k77\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.631337 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2b8fa876-0579-4ff1-be1c-4a45969fa4ae\") " pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:02 crc kubenswrapper[4834]: I1126 12:25:02.637204 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:04 crc kubenswrapper[4834]: I1126 12:25:04.402603 4834 scope.go:117] "RemoveContainer" containerID="c826f85f3903d6a089737ad54a6333df3905122d5665618f74f9353d8ea03ffa" Nov 26 12:25:05 crc kubenswrapper[4834]: I1126 12:25:05.974867 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n9hx4"] Nov 26 12:25:06 crc kubenswrapper[4834]: W1126 12:25:06.007014 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2217b645_a751_42b8_be14_6587b294bf48.slice/crio-4280ec4e8caceec0cb0b666d861d44c0670c552cc902aa5294f8082844199d12 WatchSource:0}: Error finding container 4280ec4e8caceec0cb0b666d861d44c0670c552cc902aa5294f8082844199d12: Status 404 returned error can't find the container with id 4280ec4e8caceec0cb0b666d861d44c0670c552cc902aa5294f8082844199d12 Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.060154 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.152178 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 12:25:06 crc kubenswrapper[4834]: W1126 12:25:06.189910 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b8fa876_0579_4ff1_be1c_4a45969fa4ae.slice/crio-8a2d6f1b6fb6ca461ee2a1c14cf4bbfa67e5042c37ad6eed15ef8cc2627bd020 WatchSource:0}: Error finding container 8a2d6f1b6fb6ca461ee2a1c14cf4bbfa67e5042c37ad6eed15ef8cc2627bd020: Status 404 returned error can't find the container with id 8a2d6f1b6fb6ca461ee2a1c14cf4bbfa67e5042c37ad6eed15ef8cc2627bd020 Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.337178 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"6a511537-2e50-4f68-9c68-dcb20e489cb3","Type":"ContainerStarted","Data":"a2acf6de0e53f818913e9bc59ddf7cdcd10fc7f327611c7f518e1964fcf77e29"} Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.338922 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2b8fa876-0579-4ff1-be1c-4a45969fa4ae","Type":"ContainerStarted","Data":"8a2d6f1b6fb6ca461ee2a1c14cf4bbfa67e5042c37ad6eed15ef8cc2627bd020"} Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.341904 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" event={"ID":"f2e8472a-0359-4818-b827-e901407fdcdf","Type":"ContainerStarted","Data":"eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e"} Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.342051 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.346518 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" event={"ID":"649f4660-441e-45a2-bdcd-ed292bf1d153","Type":"ContainerStarted","Data":"55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce"} Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.346724 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.347792 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n9hx4" event={"ID":"2217b645-a751-42b8-be14-6587b294bf48","Type":"ContainerStarted","Data":"4280ec4e8caceec0cb0b666d861d44c0670c552cc902aa5294f8082844199d12"} Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.367735 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" podStartSLOduration=17.580561491 
podStartE2EDuration="18.367714939s" podCreationTimestamp="2025-11-26 12:24:48 +0000 UTC" firstStartedPulling="2025-11-26 12:24:57.292681409 +0000 UTC m=+795.199894760" lastFinishedPulling="2025-11-26 12:24:58.079834857 +0000 UTC m=+795.987048208" observedRunningTime="2025-11-26 12:25:06.357948321 +0000 UTC m=+804.265161673" watchObservedRunningTime="2025-11-26 12:25:06.367714939 +0000 UTC m=+804.274928291" Nov 26 12:25:06 crc kubenswrapper[4834]: I1126 12:25:06.392815 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" podStartSLOduration=17.741962559 podStartE2EDuration="18.392800234s" podCreationTimestamp="2025-11-26 12:24:48 +0000 UTC" firstStartedPulling="2025-11-26 12:24:57.975646269 +0000 UTC m=+795.882859621" lastFinishedPulling="2025-11-26 12:24:58.626483943 +0000 UTC m=+796.533697296" observedRunningTime="2025-11-26 12:25:06.391213101 +0000 UTC m=+804.298426453" watchObservedRunningTime="2025-11-26 12:25:06.392800234 +0000 UTC m=+804.300013587" Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.361254 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e6b694cd-3381-4f15-8d74-8cfc72753ae3","Type":"ContainerStarted","Data":"c8aee7a683cad3ddbd92f0e6fdcb8289a9f7780c6ef0ebfbcbba002f51dd840b"} Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.363858 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f30c5fe-7895-474e-a94d-967b23650025","Type":"ContainerStarted","Data":"f2360140e760e2b0540dfc06873720e62ea3d3673fe4b602b6263cef14fad59a"} Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.365748 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4c5ced50-7529-4c2a-822b-0a10cf6a9700","Type":"ContainerStarted","Data":"ccf1ef71ad3967c46fe1883646f07a04c7e30926c784b7a0eaa378ebb3f0d80b"} Nov 26 12:25:07 crc kubenswrapper[4834]: 
I1126 12:25:07.367145 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8","Type":"ContainerStarted","Data":"5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b"} Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.367326 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.368549 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"614cfce6-4cb6-46ed-9012-a2ff7faf0a64","Type":"ContainerStarted","Data":"a1f88270b78425d59655020e0066febb50dab6aa1f3869fb6495de4663d31d9c"} Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.368612 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.370128 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"25058ace-b8d8-4ac1-924a-844f3955f0fb","Type":"ContainerStarted","Data":"2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c"} Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.371717 4834 generic.go:334] "Generic (PLEG): container finished" podID="dc1e8363-c0e7-428e-b215-5d246d6c5094" containerID="c0dfe62feaa821965d588c80a1e030bd6e832b75dee40d0118eacd1e092ef9c3" exitCode=0 Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.371800 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7b48f" event={"ID":"dc1e8363-c0e7-428e-b215-5d246d6c5094","Type":"ContainerDied","Data":"c0dfe62feaa821965d588c80a1e030bd6e832b75dee40d0118eacd1e092ef9c3"} Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.373500 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-n6tm2" 
event={"ID":"b0ec582c-ead4-4350-ac7f-530f80804717","Type":"ContainerStarted","Data":"d86012e037a8695434008c72cea5e0f4df6a7fdb8f2b94db1bb00acbb732b0c1"} Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.373579 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-n6tm2" Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.433598 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=5.46218236 podStartE2EDuration="12.433572025s" podCreationTimestamp="2025-11-26 12:24:55 +0000 UTC" firstStartedPulling="2025-11-26 12:24:58.606887989 +0000 UTC m=+796.514101341" lastFinishedPulling="2025-11-26 12:25:05.578277654 +0000 UTC m=+803.485491006" observedRunningTime="2025-11-26 12:25:07.420927148 +0000 UTC m=+805.328140501" watchObservedRunningTime="2025-11-26 12:25:07.433572025 +0000 UTC m=+805.340785376" Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.477050 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-n6tm2" podStartSLOduration=3.600358564 podStartE2EDuration="9.477029075s" podCreationTimestamp="2025-11-26 12:24:58 +0000 UTC" firstStartedPulling="2025-11-26 12:24:59.731251866 +0000 UTC m=+797.638465219" lastFinishedPulling="2025-11-26 12:25:05.607922378 +0000 UTC m=+803.515135730" observedRunningTime="2025-11-26 12:25:07.475581335 +0000 UTC m=+805.382794688" watchObservedRunningTime="2025-11-26 12:25:07.477029075 +0000 UTC m=+805.384242427" Nov 26 12:25:07 crc kubenswrapper[4834]: I1126 12:25:07.500480 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=7.714841502 podStartE2EDuration="14.500456133s" podCreationTimestamp="2025-11-26 12:24:53 +0000 UTC" firstStartedPulling="2025-11-26 12:24:58.535530013 +0000 UTC m=+796.442743366" lastFinishedPulling="2025-11-26 12:25:05.321144644 +0000 UTC m=+803.228357997" 
observedRunningTime="2025-11-26 12:25:07.491657101 +0000 UTC m=+805.398870453" watchObservedRunningTime="2025-11-26 12:25:07.500456133 +0000 UTC m=+805.407669486" Nov 26 12:25:08 crc kubenswrapper[4834]: I1126 12:25:08.381243 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6a511537-2e50-4f68-9c68-dcb20e489cb3","Type":"ContainerStarted","Data":"220167cde64bb774bd38876609bb1c4bba4e95c205aece42e5edfc2d33509cc2"} Nov 26 12:25:08 crc kubenswrapper[4834]: I1126 12:25:08.384482 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7b48f" event={"ID":"dc1e8363-c0e7-428e-b215-5d246d6c5094","Type":"ContainerStarted","Data":"76de2ac4ff8b937433657bcd72b7c05bafdf7041de7fe8b8c679582f25f3fd92"} Nov 26 12:25:09 crc kubenswrapper[4834]: I1126 12:25:09.405414 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2b8fa876-0579-4ff1-be1c-4a45969fa4ae","Type":"ContainerStarted","Data":"cfb526f51588c4b0b91531b1e9bef4ce4477a3e35139c5e7be627c0162a138ee"} Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.418484 4834 generic.go:334] "Generic (PLEG): container finished" podID="e6b694cd-3381-4f15-8d74-8cfc72753ae3" containerID="c8aee7a683cad3ddbd92f0e6fdcb8289a9f7780c6ef0ebfbcbba002f51dd840b" exitCode=0 Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.421205 4834 generic.go:334] "Generic (PLEG): container finished" podID="4c5ced50-7529-4c2a-822b-0a10cf6a9700" containerID="ccf1ef71ad3967c46fe1883646f07a04c7e30926c784b7a0eaa378ebb3f0d80b" exitCode=0 Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.426453 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e6b694cd-3381-4f15-8d74-8cfc72753ae3","Type":"ContainerDied","Data":"c8aee7a683cad3ddbd92f0e6fdcb8289a9f7780c6ef0ebfbcbba002f51dd840b"} Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.426498 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4c5ced50-7529-4c2a-822b-0a10cf6a9700","Type":"ContainerDied","Data":"ccf1ef71ad3967c46fe1883646f07a04c7e30926c784b7a0eaa378ebb3f0d80b"} Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.426513 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n9hx4" event={"ID":"2217b645-a751-42b8-be14-6587b294bf48","Type":"ContainerStarted","Data":"3ec67896e93be1289143d110bc845def57dabb30e94ef4a584c1d55bd88ba0a3"} Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.427574 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7b48f" event={"ID":"dc1e8363-c0e7-428e-b215-5d246d6c5094","Type":"ContainerStarted","Data":"518d6cf119eb89e41ecbcba04bac736188e095d7a70d156c5bf7a0fde3a40fdf"} Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.427697 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.429490 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6a511537-2e50-4f68-9c68-dcb20e489cb3","Type":"ContainerStarted","Data":"8adfb7003ef08d7ec70c0dd987d58208fe25ea2d895f2c5353767b48bd1aca39"} Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.431234 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2b8fa876-0579-4ff1-be1c-4a45969fa4ae","Type":"ContainerStarted","Data":"abe87d731551e83a2e384c60dbab13960965c2d29f47db443c57c6511ac7feac"} Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.444999 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-n9hx4" podStartSLOduration=8.64988753 podStartE2EDuration="11.444977711s" podCreationTimestamp="2025-11-26 12:24:59 +0000 UTC" firstStartedPulling="2025-11-26 12:25:06.011595391 +0000 UTC m=+803.918808743" 
lastFinishedPulling="2025-11-26 12:25:08.806685571 +0000 UTC m=+806.713898924" observedRunningTime="2025-11-26 12:25:10.442125943 +0000 UTC m=+808.349339295" watchObservedRunningTime="2025-11-26 12:25:10.444977711 +0000 UTC m=+808.352191062" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.486172 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.6435075470000005 podStartE2EDuration="9.486153129s" podCreationTimestamp="2025-11-26 12:25:01 +0000 UTC" firstStartedPulling="2025-11-26 12:25:06.192699847 +0000 UTC m=+804.099913199" lastFinishedPulling="2025-11-26 12:25:10.03534543 +0000 UTC m=+807.942558781" observedRunningTime="2025-11-26 12:25:10.477483049 +0000 UTC m=+808.384696421" watchObservedRunningTime="2025-11-26 12:25:10.486153129 +0000 UTC m=+808.393366481" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.525331 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.014254079 podStartE2EDuration="9.525298488s" podCreationTimestamp="2025-11-26 12:25:01 +0000 UTC" firstStartedPulling="2025-11-26 12:25:06.295686528 +0000 UTC m=+804.202899870" lastFinishedPulling="2025-11-26 12:25:08.806730927 +0000 UTC m=+806.713944279" observedRunningTime="2025-11-26 12:25:10.497339815 +0000 UTC m=+808.404553166" watchObservedRunningTime="2025-11-26 12:25:10.525298488 +0000 UTC m=+808.432511840" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.544526 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7b48f" podStartSLOduration=6.711005218 podStartE2EDuration="12.544509807s" podCreationTimestamp="2025-11-26 12:24:58 +0000 UTC" firstStartedPulling="2025-11-26 12:24:59.746743601 +0000 UTC m=+797.653956953" lastFinishedPulling="2025-11-26 12:25:05.580248189 +0000 UTC m=+803.487461542" observedRunningTime="2025-11-26 12:25:10.53761795 +0000 UTC 
m=+808.444831302" watchObservedRunningTime="2025-11-26 12:25:10.544509807 +0000 UTC m=+808.451723159" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.615899 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-cn5pw"] Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.616132 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" podUID="f2e8472a-0359-4818-b827-e901407fdcdf" containerName="dnsmasq-dns" containerID="cri-o://eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e" gracePeriod=10 Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.641597 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-nnhrc"] Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.643101 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.647018 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.650095 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-nnhrc"] Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.657278 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-config\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.657407 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b8gr\" (UniqueName: \"kubernetes.io/projected/53846057-6871-4d5b-95ec-b642c93f911c-kube-api-access-6b8gr\") pod 
\"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.657491 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-ovsdbserver-nb\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.657537 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-dns-svc\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.759384 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-dns-svc\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.759539 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-config\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.759629 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b8gr\" (UniqueName: \"kubernetes.io/projected/53846057-6871-4d5b-95ec-b642c93f911c-kube-api-access-6b8gr\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: 
\"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.759699 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-ovsdbserver-nb\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.760822 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-ovsdbserver-nb\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.760869 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-config\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.761459 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-dns-svc\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.778820 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b8gr\" (UniqueName: \"kubernetes.io/projected/53846057-6871-4d5b-95ec-b642c93f911c-kube-api-access-6b8gr\") pod \"dnsmasq-dns-65c78595c5-nnhrc\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 
12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.812476 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-pgp2x"] Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.812735 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" podUID="649f4660-441e-45a2-bdcd-ed292bf1d153" containerName="dnsmasq-dns" containerID="cri-o://55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce" gracePeriod=10 Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.844242 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-nxj5r"] Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.846151 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.850022 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.855434 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-nxj5r"] Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.860249 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-config\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.860283 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-dns-svc\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc 
kubenswrapper[4834]: I1126 12:25:10.860302 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.860364 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.860389 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcb4m\" (UniqueName: \"kubernetes.io/projected/2fcb2f84-372f-4136-a20b-79a4388eda80-kube-api-access-jcb4m\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: E1126 12:25:10.891845 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod649f4660_441e_45a2_bdcd_ed292bf1d153.slice/crio-conmon-55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce.scope\": RecentStats: unable to find data in memory cache]" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.961830 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-config\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " 
pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.962141 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-dns-svc\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.962165 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.962246 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.962288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcb4m\" (UniqueName: \"kubernetes.io/projected/2fcb2f84-372f-4136-a20b-79a4388eda80-kube-api-access-jcb4m\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.963375 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-config\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc 
kubenswrapper[4834]: I1126 12:25:10.963619 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-dns-svc\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.963849 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.965574 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:10 crc kubenswrapper[4834]: I1126 12:25:10.977069 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcb4m\" (UniqueName: \"kubernetes.io/projected/2fcb2f84-372f-4136-a20b-79a4388eda80-kube-api-access-jcb4m\") pod \"dnsmasq-dns-5c7b6b5695-nxj5r\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.002055 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.019147 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.066750 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-dns-svc\") pod \"f2e8472a-0359-4818-b827-e901407fdcdf\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.066825 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-config\") pod \"f2e8472a-0359-4818-b827-e901407fdcdf\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.109267 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-config" (OuterVolumeSpecName: "config") pod "f2e8472a-0359-4818-b827-e901407fdcdf" (UID: "f2e8472a-0359-4818-b827-e901407fdcdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.109848 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2e8472a-0359-4818-b827-e901407fdcdf" (UID: "f2e8472a-0359-4818-b827-e901407fdcdf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.168050 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7tww\" (UniqueName: \"kubernetes.io/projected/f2e8472a-0359-4818-b827-e901407fdcdf-kube-api-access-z7tww\") pod \"f2e8472a-0359-4818-b827-e901407fdcdf\" (UID: \"f2e8472a-0359-4818-b827-e901407fdcdf\") " Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.168635 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.168650 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e8472a-0359-4818-b827-e901407fdcdf-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.172711 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e8472a-0359-4818-b827-e901407fdcdf-kube-api-access-z7tww" (OuterVolumeSpecName: "kube-api-access-z7tww") pod "f2e8472a-0359-4818-b827-e901407fdcdf" (UID: "f2e8472a-0359-4818-b827-e901407fdcdf"). InnerVolumeSpecName "kube-api-access-z7tww". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.203764 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.238229 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.270029 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-dns-svc\") pod \"649f4660-441e-45a2-bdcd-ed292bf1d153\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.270147 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-config\") pod \"649f4660-441e-45a2-bdcd-ed292bf1d153\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.270277 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss222\" (UniqueName: \"kubernetes.io/projected/649f4660-441e-45a2-bdcd-ed292bf1d153-kube-api-access-ss222\") pod \"649f4660-441e-45a2-bdcd-ed292bf1d153\" (UID: \"649f4660-441e-45a2-bdcd-ed292bf1d153\") " Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.270822 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7tww\" (UniqueName: \"kubernetes.io/projected/f2e8472a-0359-4818-b827-e901407fdcdf-kube-api-access-z7tww\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.274865 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649f4660-441e-45a2-bdcd-ed292bf1d153-kube-api-access-ss222" (OuterVolumeSpecName: "kube-api-access-ss222") pod "649f4660-441e-45a2-bdcd-ed292bf1d153" (UID: "649f4660-441e-45a2-bdcd-ed292bf1d153"). InnerVolumeSpecName "kube-api-access-ss222". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.301985 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-config" (OuterVolumeSpecName: "config") pod "649f4660-441e-45a2-bdcd-ed292bf1d153" (UID: "649f4660-441e-45a2-bdcd-ed292bf1d153"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.303046 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "649f4660-441e-45a2-bdcd-ed292bf1d153" (UID: "649f4660-441e-45a2-bdcd-ed292bf1d153"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.371833 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.371858 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/649f4660-441e-45a2-bdcd-ed292bf1d153-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.371869 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss222\" (UniqueName: \"kubernetes.io/projected/649f4660-441e-45a2-bdcd-ed292bf1d153-kube-api-access-ss222\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.443032 4834 generic.go:334] "Generic (PLEG): container finished" podID="649f4660-441e-45a2-bdcd-ed292bf1d153" containerID="55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce" exitCode=0 Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 
12:25:11.443106 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" event={"ID":"649f4660-441e-45a2-bdcd-ed292bf1d153","Type":"ContainerDied","Data":"55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce"} Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.443141 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" event={"ID":"649f4660-441e-45a2-bdcd-ed292bf1d153","Type":"ContainerDied","Data":"1d6eb89cc09edba0c4e2b645f9b881ae1a5817c5ab874e813d521707991d6933"} Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.443181 4834 scope.go:117] "RemoveContainer" containerID="55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.443302 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-pgp2x" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.447210 4834 generic.go:334] "Generic (PLEG): container finished" podID="f2e8472a-0359-4818-b827-e901407fdcdf" containerID="eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e" exitCode=0 Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.447284 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.447324 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" event={"ID":"f2e8472a-0359-4818-b827-e901407fdcdf","Type":"ContainerDied","Data":"eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e"} Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.447379 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-cn5pw" event={"ID":"f2e8472a-0359-4818-b827-e901407fdcdf","Type":"ContainerDied","Data":"0600faf6804ee7519833806a49e29073627ec192bebaa8f3c2d3eed6ded0dc18"} Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.451488 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e6b694cd-3381-4f15-8d74-8cfc72753ae3","Type":"ContainerStarted","Data":"59fa54dafce870269c4dcdd333d162a51e84c742d0da7b77f7f3b99f403e3477"} Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.452165 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.455102 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4c5ced50-7529-4c2a-822b-0a10cf6a9700","Type":"ContainerStarted","Data":"7d59b1e8653d352fe80f20f5b79fe97089c010deabfb31bf84d88ae99c83b93d"} Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.456430 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.459785 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-nnhrc"] Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.465932 4834 scope.go:117] "RemoveContainer" containerID="74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211" 
Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.476478 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=14.432625983 podStartE2EDuration="21.476467055s" podCreationTimestamp="2025-11-26 12:24:50 +0000 UTC" firstStartedPulling="2025-11-26 12:24:58.537938085 +0000 UTC m=+796.445151437" lastFinishedPulling="2025-11-26 12:25:05.581779157 +0000 UTC m=+803.488992509" observedRunningTime="2025-11-26 12:25:11.471336811 +0000 UTC m=+809.378550162" watchObservedRunningTime="2025-11-26 12:25:11.476467055 +0000 UTC m=+809.383680407" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.486496 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=13.796909185 podStartE2EDuration="20.486459128s" podCreationTimestamp="2025-11-26 12:24:51 +0000 UTC" firstStartedPulling="2025-11-26 12:24:58.644995772 +0000 UTC m=+796.552209125" lastFinishedPulling="2025-11-26 12:25:05.334545715 +0000 UTC m=+803.241759068" observedRunningTime="2025-11-26 12:25:11.484789069 +0000 UTC m=+809.392002421" watchObservedRunningTime="2025-11-26 12:25:11.486459128 +0000 UTC m=+809.393672470" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.488407 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.506187 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-pgp2x"] Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.512689 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-pgp2x"] Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.516280 4834 scope.go:117] "RemoveContainer" containerID="55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce" Nov 26 12:25:11 crc kubenswrapper[4834]: E1126 12:25:11.516881 4834 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce\": container with ID starting with 55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce not found: ID does not exist" containerID="55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.516953 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce"} err="failed to get container status \"55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce\": rpc error: code = NotFound desc = could not find container \"55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce\": container with ID starting with 55e2be38fe6bcfc1feced26dc41c06c0b74bf9a9084d23b15a555902c87466ce not found: ID does not exist" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.516978 4834 scope.go:117] "RemoveContainer" containerID="74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211" Nov 26 12:25:11 crc kubenswrapper[4834]: E1126 12:25:11.517258 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211\": container with ID starting with 74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211 not found: ID does not exist" containerID="74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.517293 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211"} err="failed to get container status \"74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211\": rpc error: code = NotFound desc = could 
not find container \"74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211\": container with ID starting with 74f299611da68e499c64a3f3b0825c89e23a7047d7f09e80faf866a36ad08211 not found: ID does not exist" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.517345 4834 scope.go:117] "RemoveContainer" containerID="eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.521676 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-cn5pw"] Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.529140 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-cn5pw"] Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.534373 4834 scope.go:117] "RemoveContainer" containerID="fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.559431 4834 scope.go:117] "RemoveContainer" containerID="eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e" Nov 26 12:25:11 crc kubenswrapper[4834]: E1126 12:25:11.559808 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e\": container with ID starting with eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e not found: ID does not exist" containerID="eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.559848 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e"} err="failed to get container status \"eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e\": rpc error: code = NotFound desc = could not find container 
\"eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e\": container with ID starting with eec92b7567ad99162b660e27f2db2adb9da4401487d6c9f43d0c4f962591345e not found: ID does not exist" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.559875 4834 scope.go:117] "RemoveContainer" containerID="fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250" Nov 26 12:25:11 crc kubenswrapper[4834]: E1126 12:25:11.560151 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250\": container with ID starting with fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250 not found: ID does not exist" containerID="fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.560177 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250"} err="failed to get container status \"fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250\": rpc error: code = NotFound desc = could not find container \"fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250\": container with ID starting with fb9a1e64dacffdcb93726a7387a8974e9e46829388b13c59702dfbf45394f250 not found: ID does not exist" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.573766 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-nxj5r"] Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.638001 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.674630 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 
12:25:11.781169 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 26 12:25:11 crc kubenswrapper[4834]: I1126 12:25:11.781210 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.427597 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649f4660-441e-45a2-bdcd-ed292bf1d153" path="/var/lib/kubelet/pods/649f4660-441e-45a2-bdcd-ed292bf1d153/volumes" Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.428569 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2e8472a-0359-4818-b827-e901407fdcdf" path="/var/lib/kubelet/pods/f2e8472a-0359-4818-b827-e901407fdcdf/volumes" Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.452692 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.467983 4834 generic.go:334] "Generic (PLEG): container finished" podID="2fcb2f84-372f-4136-a20b-79a4388eda80" containerID="16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064" exitCode=0 Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.468050 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" event={"ID":"2fcb2f84-372f-4136-a20b-79a4388eda80","Type":"ContainerDied","Data":"16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064"} Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.468134 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" event={"ID":"2fcb2f84-372f-4136-a20b-79a4388eda80","Type":"ContainerStarted","Data":"0faf434d91f53c1660db6e91d6d525772de6811053c9c72985aea33ebb6841ad"} Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.469562 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="53846057-6871-4d5b-95ec-b642c93f911c" containerID="cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132" exitCode=0 Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.469638 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" event={"ID":"53846057-6871-4d5b-95ec-b642c93f911c","Type":"ContainerDied","Data":"cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132"} Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.469674 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" event={"ID":"53846057-6871-4d5b-95ec-b642c93f911c","Type":"ContainerStarted","Data":"4351acab28b4d9b86208affbf0f021f822f1754b6958eb78be5a69c881852367"} Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.471723 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:12 crc kubenswrapper[4834]: I1126 12:25:12.497927 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 26 12:25:12 crc kubenswrapper[4834]: E1126 12:25:12.990528 4834 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.26.148:57460->192.168.26.148:32785: read tcp 192.168.26.148:57460->192.168.26.148:32785: read: connection reset by peer Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.087540 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.087650 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.482835 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" 
event={"ID":"2fcb2f84-372f-4136-a20b-79a4388eda80","Type":"ContainerStarted","Data":"bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15"} Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.482995 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.485949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" event={"ID":"53846057-6871-4d5b-95ec-b642c93f911c","Type":"ContainerStarted","Data":"828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9"} Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.503937 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" podStartSLOduration=3.503925742 podStartE2EDuration="3.503925742s" podCreationTimestamp="2025-11-26 12:25:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:25:13.498947483 +0000 UTC m=+811.406160835" watchObservedRunningTime="2025-11-26 12:25:13.503925742 +0000 UTC m=+811.411139094" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.517173 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.519771 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" podStartSLOduration=3.519750313 podStartE2EDuration="3.519750313s" podCreationTimestamp="2025-11-26 12:25:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:25:13.518694142 +0000 UTC m=+811.425907494" watchObservedRunningTime="2025-11-26 12:25:13.519750313 +0000 UTC m=+811.426963665" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 
12:25:13.643721 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 26 12:25:13 crc kubenswrapper[4834]: E1126 12:25:13.644230 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e8472a-0359-4818-b827-e901407fdcdf" containerName="dnsmasq-dns" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.644296 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e8472a-0359-4818-b827-e901407fdcdf" containerName="dnsmasq-dns" Nov 26 12:25:13 crc kubenswrapper[4834]: E1126 12:25:13.644392 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e8472a-0359-4818-b827-e901407fdcdf" containerName="init" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.644463 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e8472a-0359-4818-b827-e901407fdcdf" containerName="init" Nov 26 12:25:13 crc kubenswrapper[4834]: E1126 12:25:13.644514 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649f4660-441e-45a2-bdcd-ed292bf1d153" containerName="dnsmasq-dns" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.644564 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="649f4660-441e-45a2-bdcd-ed292bf1d153" containerName="dnsmasq-dns" Nov 26 12:25:13 crc kubenswrapper[4834]: E1126 12:25:13.644618 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649f4660-441e-45a2-bdcd-ed292bf1d153" containerName="init" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.644659 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="649f4660-441e-45a2-bdcd-ed292bf1d153" containerName="init" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.644831 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e8472a-0359-4818-b827-e901407fdcdf" containerName="dnsmasq-dns" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.644894 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="649f4660-441e-45a2-bdcd-ed292bf1d153" 
containerName="dnsmasq-dns" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.645670 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.649075 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.649543 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.650983 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.653002 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-9qd46" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.654475 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.681703 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.718225 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbd7948b-27ac-4472-a49b-23eeac081fb9-scripts\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.718415 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbd7948b-27ac-4472-a49b-23eeac081fb9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.718458 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd7948b-27ac-4472-a49b-23eeac081fb9-config\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.718525 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd7948b-27ac-4472-a49b-23eeac081fb9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.718653 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd7948b-27ac-4472-a49b-23eeac081fb9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.718842 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd7948b-27ac-4472-a49b-23eeac081fb9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.718927 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv6mr\" (UniqueName: \"kubernetes.io/projected/dbd7948b-27ac-4472-a49b-23eeac081fb9-kube-api-access-xv6mr\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.822585 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd7948b-27ac-4472-a49b-23eeac081fb9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.824215 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv6mr\" (UniqueName: \"kubernetes.io/projected/dbd7948b-27ac-4472-a49b-23eeac081fb9-kube-api-access-xv6mr\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.824467 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbd7948b-27ac-4472-a49b-23eeac081fb9-scripts\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.824571 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbd7948b-27ac-4472-a49b-23eeac081fb9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.824641 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd7948b-27ac-4472-a49b-23eeac081fb9-config\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.824899 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd7948b-27ac-4472-a49b-23eeac081fb9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" 
Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.824942 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd7948b-27ac-4472-a49b-23eeac081fb9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.825550 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbd7948b-27ac-4472-a49b-23eeac081fb9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.825570 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbd7948b-27ac-4472-a49b-23eeac081fb9-scripts\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.825716 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd7948b-27ac-4472-a49b-23eeac081fb9-config\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.832113 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd7948b-27ac-4472-a49b-23eeac081fb9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.832229 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd7948b-27ac-4472-a49b-23eeac081fb9-combined-ca-bundle\") pod 
\"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.835159 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd7948b-27ac-4472-a49b-23eeac081fb9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.845049 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv6mr\" (UniqueName: \"kubernetes.io/projected/dbd7948b-27ac-4472-a49b-23eeac081fb9-kube-api-access-xv6mr\") pod \"ovn-northd-0\" (UID: \"dbd7948b-27ac-4472-a49b-23eeac081fb9\") " pod="openstack/ovn-northd-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.845400 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.913008 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 26 12:25:13 crc kubenswrapper[4834]: I1126 12:25:13.964001 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 26 12:25:14 crc kubenswrapper[4834]: I1126 12:25:14.336585 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 26 12:25:14 crc kubenswrapper[4834]: I1126 12:25:14.343510 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 12:25:14 crc kubenswrapper[4834]: I1126 12:25:14.495147 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dbd7948b-27ac-4472-a49b-23eeac081fb9","Type":"ContainerStarted","Data":"1bb5c309db2fb2ceb8849677573936dbd2732213454cc046676a6bcb1162f279"} Nov 26 12:25:14 crc kubenswrapper[4834]: I1126 12:25:14.495189 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:15 crc kubenswrapper[4834]: I1126 12:25:15.604854 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 26 12:25:15 crc kubenswrapper[4834]: I1126 12:25:15.666204 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 26 12:25:15 crc kubenswrapper[4834]: I1126 12:25:15.853349 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 26 12:25:20 crc kubenswrapper[4834]: I1126 12:25:20.541376 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dbd7948b-27ac-4472-a49b-23eeac081fb9","Type":"ContainerStarted","Data":"a147b550b750bbac1f1007330da508ac82fa3a03fd21b96b24adf753246f8b8b"} Nov 26 12:25:20 crc kubenswrapper[4834]: I1126 12:25:20.541989 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 26 12:25:20 crc kubenswrapper[4834]: I1126 12:25:20.542002 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"dbd7948b-27ac-4472-a49b-23eeac081fb9","Type":"ContainerStarted","Data":"eda9a84259c355f1896b496f46048be5d111d92825dc8a9c314896bd821fce84"} Nov 26 12:25:21 crc kubenswrapper[4834]: I1126 12:25:21.020442 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:21 crc kubenswrapper[4834]: I1126 12:25:21.036877 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.720893141 podStartE2EDuration="8.036813134s" podCreationTimestamp="2025-11-26 12:25:13 +0000 UTC" firstStartedPulling="2025-11-26 12:25:14.343205108 +0000 UTC m=+812.250418460" lastFinishedPulling="2025-11-26 12:25:19.659125101 +0000 UTC m=+817.566338453" observedRunningTime="2025-11-26 12:25:20.563625431 +0000 UTC m=+818.470838773" watchObservedRunningTime="2025-11-26 12:25:21.036813134 +0000 UTC m=+818.944026487" Nov 26 12:25:21 crc kubenswrapper[4834]: I1126 12:25:21.205457 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:21 crc kubenswrapper[4834]: I1126 12:25:21.255118 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-nnhrc"] Nov 26 12:25:21 crc kubenswrapper[4834]: I1126 12:25:21.548983 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" podUID="53846057-6871-4d5b-95ec-b642c93f911c" containerName="dnsmasq-dns" containerID="cri-o://828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9" gracePeriod=10 Nov 26 12:25:21 crc kubenswrapper[4834]: I1126 12:25:21.932787 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.070682 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b8gr\" (UniqueName: \"kubernetes.io/projected/53846057-6871-4d5b-95ec-b642c93f911c-kube-api-access-6b8gr\") pod \"53846057-6871-4d5b-95ec-b642c93f911c\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.070799 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-config\") pod \"53846057-6871-4d5b-95ec-b642c93f911c\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.070830 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-dns-svc\") pod \"53846057-6871-4d5b-95ec-b642c93f911c\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.071011 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-ovsdbserver-nb\") pod \"53846057-6871-4d5b-95ec-b642c93f911c\" (UID: \"53846057-6871-4d5b-95ec-b642c93f911c\") " Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.083991 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53846057-6871-4d5b-95ec-b642c93f911c-kube-api-access-6b8gr" (OuterVolumeSpecName: "kube-api-access-6b8gr") pod "53846057-6871-4d5b-95ec-b642c93f911c" (UID: "53846057-6871-4d5b-95ec-b642c93f911c"). InnerVolumeSpecName "kube-api-access-6b8gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.104175 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53846057-6871-4d5b-95ec-b642c93f911c" (UID: "53846057-6871-4d5b-95ec-b642c93f911c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.105417 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-config" (OuterVolumeSpecName: "config") pod "53846057-6871-4d5b-95ec-b642c93f911c" (UID: "53846057-6871-4d5b-95ec-b642c93f911c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.106236 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53846057-6871-4d5b-95ec-b642c93f911c" (UID: "53846057-6871-4d5b-95ec-b642c93f911c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.172915 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b8gr\" (UniqueName: \"kubernetes.io/projected/53846057-6871-4d5b-95ec-b642c93f911c-kube-api-access-6b8gr\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.172946 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.172957 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.172965 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53846057-6871-4d5b-95ec-b642c93f911c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.557616 4834 generic.go:334] "Generic (PLEG): container finished" podID="53846057-6871-4d5b-95ec-b642c93f911c" containerID="828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9" exitCode=0 Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.557727 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.557720 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" event={"ID":"53846057-6871-4d5b-95ec-b642c93f911c","Type":"ContainerDied","Data":"828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9"} Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.558086 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-nnhrc" event={"ID":"53846057-6871-4d5b-95ec-b642c93f911c","Type":"ContainerDied","Data":"4351acab28b4d9b86208affbf0f021f822f1754b6958eb78be5a69c881852367"} Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.558115 4834 scope.go:117] "RemoveContainer" containerID="828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.574909 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-nnhrc"] Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.577294 4834 scope.go:117] "RemoveContainer" containerID="cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.579023 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-nnhrc"] Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.591695 4834 scope.go:117] "RemoveContainer" containerID="828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9" Nov 26 12:25:22 crc kubenswrapper[4834]: E1126 12:25:22.591982 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9\": container with ID starting with 828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9 not found: ID does not exist" 
containerID="828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.592023 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9"} err="failed to get container status \"828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9\": rpc error: code = NotFound desc = could not find container \"828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9\": container with ID starting with 828640d999694c8d0b0a83d309bf384c6638418c98112cf25c3d1ba518dfd9a9 not found: ID does not exist" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.592053 4834 scope.go:117] "RemoveContainer" containerID="cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132" Nov 26 12:25:22 crc kubenswrapper[4834]: E1126 12:25:22.592299 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132\": container with ID starting with cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132 not found: ID does not exist" containerID="cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132" Nov 26 12:25:22 crc kubenswrapper[4834]: I1126 12:25:22.592409 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132"} err="failed to get container status \"cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132\": rpc error: code = NotFound desc = could not find container \"cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132\": container with ID starting with cb5d0a6e7969b0fb412084c9bf140eaa6b07a165a7df0d13b6eebddf528f1132 not found: ID does not exist" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.288716 4834 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-4da5-account-create-update-46kvg"] Nov 26 12:25:23 crc kubenswrapper[4834]: E1126 12:25:23.289022 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53846057-6871-4d5b-95ec-b642c93f911c" containerName="init" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.289040 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="53846057-6871-4d5b-95ec-b642c93f911c" containerName="init" Nov 26 12:25:23 crc kubenswrapper[4834]: E1126 12:25:23.289060 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53846057-6871-4d5b-95ec-b642c93f911c" containerName="dnsmasq-dns" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.289067 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="53846057-6871-4d5b-95ec-b642c93f911c" containerName="dnsmasq-dns" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.289218 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="53846057-6871-4d5b-95ec-b642c93f911c" containerName="dnsmasq-dns" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.289709 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.295846 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.298849 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4da5-account-create-update-46kvg"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.356270 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8lrs8"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.357402 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.363026 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8lrs8"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.395761 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a043f350-a471-483b-aa31-117d66b38cf5-operator-scripts\") pod \"keystone-4da5-account-create-update-46kvg\" (UID: \"a043f350-a471-483b-aa31-117d66b38cf5\") " pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.395904 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ql5p\" (UniqueName: \"kubernetes.io/projected/a043f350-a471-483b-aa31-117d66b38cf5-kube-api-access-7ql5p\") pod \"keystone-4da5-account-create-update-46kvg\" (UID: \"a043f350-a471-483b-aa31-117d66b38cf5\") " pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.497159 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80b41954-7fc3-4e18-9fde-08323d1a5aa6-operator-scripts\") pod \"keystone-db-create-8lrs8\" (UID: \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\") " pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.497424 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a043f350-a471-483b-aa31-117d66b38cf5-operator-scripts\") pod \"keystone-4da5-account-create-update-46kvg\" (UID: \"a043f350-a471-483b-aa31-117d66b38cf5\") " pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.497524 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqpr\" (UniqueName: \"kubernetes.io/projected/80b41954-7fc3-4e18-9fde-08323d1a5aa6-kube-api-access-cpqpr\") pod \"keystone-db-create-8lrs8\" (UID: \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\") " pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.497598 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ql5p\" (UniqueName: \"kubernetes.io/projected/a043f350-a471-483b-aa31-117d66b38cf5-kube-api-access-7ql5p\") pod \"keystone-4da5-account-create-update-46kvg\" (UID: \"a043f350-a471-483b-aa31-117d66b38cf5\") " pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.498157 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a043f350-a471-483b-aa31-117d66b38cf5-operator-scripts\") pod \"keystone-4da5-account-create-update-46kvg\" (UID: \"a043f350-a471-483b-aa31-117d66b38cf5\") " pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.515139 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ql5p\" (UniqueName: \"kubernetes.io/projected/a043f350-a471-483b-aa31-117d66b38cf5-kube-api-access-7ql5p\") pod \"keystone-4da5-account-create-update-46kvg\" (UID: \"a043f350-a471-483b-aa31-117d66b38cf5\") " pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.565382 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rzjvj"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.567050 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.583115 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rzjvj"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.598706 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqpr\" (UniqueName: \"kubernetes.io/projected/80b41954-7fc3-4e18-9fde-08323d1a5aa6-kube-api-access-cpqpr\") pod \"keystone-db-create-8lrs8\" (UID: \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\") " pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.598808 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80b41954-7fc3-4e18-9fde-08323d1a5aa6-operator-scripts\") pod \"keystone-db-create-8lrs8\" (UID: \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\") " pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.599435 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80b41954-7fc3-4e18-9fde-08323d1a5aa6-operator-scripts\") pod \"keystone-db-create-8lrs8\" (UID: \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\") " pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.608897 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.613788 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqpr\" (UniqueName: \"kubernetes.io/projected/80b41954-7fc3-4e18-9fde-08323d1a5aa6-kube-api-access-cpqpr\") pod \"keystone-db-create-8lrs8\" (UID: \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\") " pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.670005 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7fa5-account-create-update-t2rx7"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.677136 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.680009 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.680249 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7fa5-account-create-update-t2rx7"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.680800 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.701199 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l57r4\" (UniqueName: \"kubernetes.io/projected/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-kube-api-access-l57r4\") pod \"placement-db-create-rzjvj\" (UID: \"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\") " pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.701648 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-operator-scripts\") pod \"placement-db-create-rzjvj\" (UID: \"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\") " pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.796241 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-wm9lb"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.797551 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.803423 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxmgr\" (UniqueName: \"kubernetes.io/projected/0e8838f6-8981-4f7d-a871-f09435bfc1ee-kube-api-access-sxmgr\") pod \"placement-7fa5-account-create-update-t2rx7\" (UID: \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\") " pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.803501 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e8838f6-8981-4f7d-a871-f09435bfc1ee-operator-scripts\") pod \"placement-7fa5-account-create-update-t2rx7\" (UID: \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\") " pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.803719 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l57r4\" (UniqueName: \"kubernetes.io/projected/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-kube-api-access-l57r4\") pod \"placement-db-create-rzjvj\" (UID: \"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\") " pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.803779 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-operator-scripts\") pod \"placement-db-create-rzjvj\" (UID: \"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\") " pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.804517 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-operator-scripts\") pod 
\"placement-db-create-rzjvj\" (UID: \"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\") " pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.805233 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wm9lb"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.827099 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l57r4\" (UniqueName: \"kubernetes.io/projected/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-kube-api-access-l57r4\") pod \"placement-db-create-rzjvj\" (UID: \"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\") " pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.860930 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c888-account-create-update-txj78"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.861798 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.869332 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.870581 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c888-account-create-update-txj78"] Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.889402 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.905840 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2drv\" (UniqueName: \"kubernetes.io/projected/befcee03-7239-4d9e-915d-7fa7f9ebfb44-kube-api-access-t2drv\") pod \"glance-db-create-wm9lb\" (UID: \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\") " pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.905922 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxmgr\" (UniqueName: \"kubernetes.io/projected/0e8838f6-8981-4f7d-a871-f09435bfc1ee-kube-api-access-sxmgr\") pod \"placement-7fa5-account-create-update-t2rx7\" (UID: \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\") " pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.906039 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h79z9\" (UniqueName: \"kubernetes.io/projected/68884564-b059-4562-8410-956a50be744a-kube-api-access-h79z9\") pod \"glance-c888-account-create-update-txj78\" (UID: \"68884564-b059-4562-8410-956a50be744a\") " pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.906126 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e8838f6-8981-4f7d-a871-f09435bfc1ee-operator-scripts\") pod \"placement-7fa5-account-create-update-t2rx7\" (UID: \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\") " pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.906329 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/68884564-b059-4562-8410-956a50be744a-operator-scripts\") pod \"glance-c888-account-create-update-txj78\" (UID: \"68884564-b059-4562-8410-956a50be744a\") " pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.906546 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befcee03-7239-4d9e-915d-7fa7f9ebfb44-operator-scripts\") pod \"glance-db-create-wm9lb\" (UID: \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\") " pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.906854 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e8838f6-8981-4f7d-a871-f09435bfc1ee-operator-scripts\") pod \"placement-7fa5-account-create-update-t2rx7\" (UID: \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\") " pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:23 crc kubenswrapper[4834]: I1126 12:25:23.920339 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxmgr\" (UniqueName: \"kubernetes.io/projected/0e8838f6-8981-4f7d-a871-f09435bfc1ee-kube-api-access-sxmgr\") pod \"placement-7fa5-account-create-update-t2rx7\" (UID: \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\") " pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.008597 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h79z9\" (UniqueName: \"kubernetes.io/projected/68884564-b059-4562-8410-956a50be744a-kube-api-access-h79z9\") pod \"glance-c888-account-create-update-txj78\" (UID: \"68884564-b059-4562-8410-956a50be744a\") " pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.008685 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68884564-b059-4562-8410-956a50be744a-operator-scripts\") pod \"glance-c888-account-create-update-txj78\" (UID: \"68884564-b059-4562-8410-956a50be744a\") " pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.008751 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befcee03-7239-4d9e-915d-7fa7f9ebfb44-operator-scripts\") pod \"glance-db-create-wm9lb\" (UID: \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\") " pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.008840 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2drv\" (UniqueName: \"kubernetes.io/projected/befcee03-7239-4d9e-915d-7fa7f9ebfb44-kube-api-access-t2drv\") pod \"glance-db-create-wm9lb\" (UID: \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\") " pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.009565 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.010404 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68884564-b059-4562-8410-956a50be744a-operator-scripts\") pod \"glance-c888-account-create-update-txj78\" (UID: \"68884564-b059-4562-8410-956a50be744a\") " pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.010705 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befcee03-7239-4d9e-915d-7fa7f9ebfb44-operator-scripts\") pod \"glance-db-create-wm9lb\" (UID: \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\") " pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.021959 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2drv\" (UniqueName: \"kubernetes.io/projected/befcee03-7239-4d9e-915d-7fa7f9ebfb44-kube-api-access-t2drv\") pod \"glance-db-create-wm9lb\" (UID: \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\") " pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.022998 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h79z9\" (UniqueName: \"kubernetes.io/projected/68884564-b059-4562-8410-956a50be744a-kube-api-access-h79z9\") pod \"glance-c888-account-create-update-txj78\" (UID: \"68884564-b059-4562-8410-956a50be744a\") " pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.055786 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4da5-account-create-update-46kvg"] Nov 26 12:25:24 crc kubenswrapper[4834]: W1126 12:25:24.056515 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda043f350_a471_483b_aa31_117d66b38cf5.slice/crio-c0b9bfa0169c0070dfb285b44d1e661e93f3f3511bf2b08114aa0007936ee376 WatchSource:0}: Error finding container c0b9bfa0169c0070dfb285b44d1e661e93f3f3511bf2b08114aa0007936ee376: Status 404 returned error can't find the container with id c0b9bfa0169c0070dfb285b44d1e661e93f3f3511bf2b08114aa0007936ee376 Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.122699 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.144171 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8lrs8"] Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.177853 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.276521 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rzjvj"] Nov 26 12:25:24 crc kubenswrapper[4834]: W1126 12:25:24.285064 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f18cba3_901e_45d3_9f1a_a04a17fe1b4d.slice/crio-11d2da62088c651b9018284cbef1e99efd5b3be38ba0e40779a37ccf22c4de3b WatchSource:0}: Error finding container 11d2da62088c651b9018284cbef1e99efd5b3be38ba0e40779a37ccf22c4de3b: Status 404 returned error can't find the container with id 11d2da62088c651b9018284cbef1e99efd5b3be38ba0e40779a37ccf22c4de3b Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.395593 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7fa5-account-create-update-t2rx7"] Nov 26 12:25:24 crc kubenswrapper[4834]: W1126 12:25:24.411564 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e8838f6_8981_4f7d_a871_f09435bfc1ee.slice/crio-5848fd3396932ec337710062fb8b084fcb1fa6560e97d396f6a9ff0456315307 WatchSource:0}: Error finding container 5848fd3396932ec337710062fb8b084fcb1fa6560e97d396f6a9ff0456315307: Status 404 returned error can't find the container with id 5848fd3396932ec337710062fb8b084fcb1fa6560e97d396f6a9ff0456315307 Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.427725 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53846057-6871-4d5b-95ec-b642c93f911c" path="/var/lib/kubelet/pods/53846057-6871-4d5b-95ec-b642c93f911c/volumes" Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.560635 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wm9lb"] Nov 26 12:25:24 crc kubenswrapper[4834]: W1126 12:25:24.564893 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbefcee03_7239_4d9e_915d_7fa7f9ebfb44.slice/crio-de5d4ea76225c354d096a7d0d4f250f435440bad5689c3f26c19e4afbc003f76 WatchSource:0}: Error finding container de5d4ea76225c354d096a7d0d4f250f435440bad5689c3f26c19e4afbc003f76: Status 404 returned error can't find the container with id de5d4ea76225c354d096a7d0d4f250f435440bad5689c3f26c19e4afbc003f76 Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.594514 4834 generic.go:334] "Generic (PLEG): container finished" podID="80b41954-7fc3-4e18-9fde-08323d1a5aa6" containerID="16f70ec28c0dc21712e2ff1513c0b545325790bc8041d8b96fc08c491861e475" exitCode=0 Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.594592 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8lrs8" event={"ID":"80b41954-7fc3-4e18-9fde-08323d1a5aa6","Type":"ContainerDied","Data":"16f70ec28c0dc21712e2ff1513c0b545325790bc8041d8b96fc08c491861e475"} Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.594629 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8lrs8" event={"ID":"80b41954-7fc3-4e18-9fde-08323d1a5aa6","Type":"ContainerStarted","Data":"834343ddf5ef4552ae1f3e02d7afb2e7b980d3b5ecc1faf75b6a694d6e52042a"} Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.597833 4834 generic.go:334] "Generic (PLEG): container finished" podID="a043f350-a471-483b-aa31-117d66b38cf5" containerID="6f6cfe17b022b47b23e8855789246d181a4f11dbd89b7f79281fc8c0769219ed" exitCode=0 Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.597927 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4da5-account-create-update-46kvg" event={"ID":"a043f350-a471-483b-aa31-117d66b38cf5","Type":"ContainerDied","Data":"6f6cfe17b022b47b23e8855789246d181a4f11dbd89b7f79281fc8c0769219ed"} Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.597958 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4da5-account-create-update-46kvg" event={"ID":"a043f350-a471-483b-aa31-117d66b38cf5","Type":"ContainerStarted","Data":"c0b9bfa0169c0070dfb285b44d1e661e93f3f3511bf2b08114aa0007936ee376"} Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.600708 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fa5-account-create-update-t2rx7" event={"ID":"0e8838f6-8981-4f7d-a871-f09435bfc1ee","Type":"ContainerStarted","Data":"247a86de535a490d43352f9c5b84cb2fc502769f25f3d8927c150f8d508d4953"} Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.600750 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fa5-account-create-update-t2rx7" event={"ID":"0e8838f6-8981-4f7d-a871-f09435bfc1ee","Type":"ContainerStarted","Data":"5848fd3396932ec337710062fb8b084fcb1fa6560e97d396f6a9ff0456315307"} Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.603733 4834 generic.go:334] "Generic (PLEG): container finished" podID="0f18cba3-901e-45d3-9f1a-a04a17fe1b4d" 
containerID="37f4d14d3e94a1f32e6c5e55c04ebd2f89f021bc9738852546082a8228e3df1c" exitCode=0 Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.603815 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rzjvj" event={"ID":"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d","Type":"ContainerDied","Data":"37f4d14d3e94a1f32e6c5e55c04ebd2f89f021bc9738852546082a8228e3df1c"} Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.603843 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rzjvj" event={"ID":"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d","Type":"ContainerStarted","Data":"11d2da62088c651b9018284cbef1e99efd5b3be38ba0e40779a37ccf22c4de3b"} Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.607224 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wm9lb" event={"ID":"befcee03-7239-4d9e-915d-7fa7f9ebfb44","Type":"ContainerStarted","Data":"de5d4ea76225c354d096a7d0d4f250f435440bad5689c3f26c19e4afbc003f76"} Nov 26 12:25:24 crc kubenswrapper[4834]: W1126 12:25:24.652058 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68884564_b059_4562_8410_956a50be744a.slice/crio-390fd442735d67e7556a62f7804344fc63555459b3d567e2e818b182238d66a6 WatchSource:0}: Error finding container 390fd442735d67e7556a62f7804344fc63555459b3d567e2e818b182238d66a6: Status 404 returned error can't find the container with id 390fd442735d67e7556a62f7804344fc63555459b3d567e2e818b182238d66a6 Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.652451 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c888-account-create-update-txj78"] Nov 26 12:25:24 crc kubenswrapper[4834]: I1126 12:25:24.655765 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7fa5-account-create-update-t2rx7" podStartSLOduration=1.6557452590000001 
podStartE2EDuration="1.655745259s" podCreationTimestamp="2025-11-26 12:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:25:24.648278698 +0000 UTC m=+822.555492051" watchObservedRunningTime="2025-11-26 12:25:24.655745259 +0000 UTC m=+822.562958611" Nov 26 12:25:25 crc kubenswrapper[4834]: I1126 12:25:25.616623 4834 generic.go:334] "Generic (PLEG): container finished" podID="befcee03-7239-4d9e-915d-7fa7f9ebfb44" containerID="d901867de9f0e91087afc5ead70be4e5c65daae684033fa9aeb243f352949716" exitCode=0 Nov 26 12:25:25 crc kubenswrapper[4834]: I1126 12:25:25.616678 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wm9lb" event={"ID":"befcee03-7239-4d9e-915d-7fa7f9ebfb44","Type":"ContainerDied","Data":"d901867de9f0e91087afc5ead70be4e5c65daae684033fa9aeb243f352949716"} Nov 26 12:25:25 crc kubenswrapper[4834]: I1126 12:25:25.618989 4834 generic.go:334] "Generic (PLEG): container finished" podID="68884564-b059-4562-8410-956a50be744a" containerID="feb7821555cf800b22fd3e1c2858ed939d9778299f04c6d7d31105857f017a3d" exitCode=0 Nov 26 12:25:25 crc kubenswrapper[4834]: I1126 12:25:25.619030 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c888-account-create-update-txj78" event={"ID":"68884564-b059-4562-8410-956a50be744a","Type":"ContainerDied","Data":"feb7821555cf800b22fd3e1c2858ed939d9778299f04c6d7d31105857f017a3d"} Nov 26 12:25:25 crc kubenswrapper[4834]: I1126 12:25:25.619155 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c888-account-create-update-txj78" event={"ID":"68884564-b059-4562-8410-956a50be744a","Type":"ContainerStarted","Data":"390fd442735d67e7556a62f7804344fc63555459b3d567e2e818b182238d66a6"} Nov 26 12:25:25 crc kubenswrapper[4834]: I1126 12:25:25.620705 4834 generic.go:334] "Generic (PLEG): container finished" podID="0e8838f6-8981-4f7d-a871-f09435bfc1ee" 
containerID="247a86de535a490d43352f9c5b84cb2fc502769f25f3d8927c150f8d508d4953" exitCode=0 Nov 26 12:25:25 crc kubenswrapper[4834]: I1126 12:25:25.620798 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fa5-account-create-update-t2rx7" event={"ID":"0e8838f6-8981-4f7d-a871-f09435bfc1ee","Type":"ContainerDied","Data":"247a86de535a490d43352f9c5b84cb2fc502769f25f3d8927c150f8d508d4953"} Nov 26 12:25:25 crc kubenswrapper[4834]: I1126 12:25:25.926511 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.019301 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.024170 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.047112 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpqpr\" (UniqueName: \"kubernetes.io/projected/80b41954-7fc3-4e18-9fde-08323d1a5aa6-kube-api-access-cpqpr\") pod \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\" (UID: \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.047336 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a043f350-a471-483b-aa31-117d66b38cf5-operator-scripts\") pod \"a043f350-a471-483b-aa31-117d66b38cf5\" (UID: \"a043f350-a471-483b-aa31-117d66b38cf5\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.047389 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l57r4\" (UniqueName: \"kubernetes.io/projected/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-kube-api-access-l57r4\") pod 
\"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\" (UID: \"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.047431 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80b41954-7fc3-4e18-9fde-08323d1a5aa6-operator-scripts\") pod \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\" (UID: \"80b41954-7fc3-4e18-9fde-08323d1a5aa6\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.047472 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ql5p\" (UniqueName: \"kubernetes.io/projected/a043f350-a471-483b-aa31-117d66b38cf5-kube-api-access-7ql5p\") pod \"a043f350-a471-483b-aa31-117d66b38cf5\" (UID: \"a043f350-a471-483b-aa31-117d66b38cf5\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.047600 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-operator-scripts\") pod \"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\" (UID: \"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.048638 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80b41954-7fc3-4e18-9fde-08323d1a5aa6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80b41954-7fc3-4e18-9fde-08323d1a5aa6" (UID: "80b41954-7fc3-4e18-9fde-08323d1a5aa6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.048816 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a043f350-a471-483b-aa31-117d66b38cf5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a043f350-a471-483b-aa31-117d66b38cf5" (UID: "a043f350-a471-483b-aa31-117d66b38cf5"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.049287 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f18cba3-901e-45d3-9f1a-a04a17fe1b4d" (UID: "0f18cba3-901e-45d3-9f1a-a04a17fe1b4d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.049606 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80b41954-7fc3-4e18-9fde-08323d1a5aa6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.049632 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.049663 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a043f350-a471-483b-aa31-117d66b38cf5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.055441 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b41954-7fc3-4e18-9fde-08323d1a5aa6-kube-api-access-cpqpr" (OuterVolumeSpecName: "kube-api-access-cpqpr") pod "80b41954-7fc3-4e18-9fde-08323d1a5aa6" (UID: "80b41954-7fc3-4e18-9fde-08323d1a5aa6"). InnerVolumeSpecName "kube-api-access-cpqpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.055609 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-kube-api-access-l57r4" (OuterVolumeSpecName: "kube-api-access-l57r4") pod "0f18cba3-901e-45d3-9f1a-a04a17fe1b4d" (UID: "0f18cba3-901e-45d3-9f1a-a04a17fe1b4d"). InnerVolumeSpecName "kube-api-access-l57r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.056142 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a043f350-a471-483b-aa31-117d66b38cf5-kube-api-access-7ql5p" (OuterVolumeSpecName: "kube-api-access-7ql5p") pod "a043f350-a471-483b-aa31-117d66b38cf5" (UID: "a043f350-a471-483b-aa31-117d66b38cf5"). InnerVolumeSpecName "kube-api-access-7ql5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.153738 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpqpr\" (UniqueName: \"kubernetes.io/projected/80b41954-7fc3-4e18-9fde-08323d1a5aa6-kube-api-access-cpqpr\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.153789 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l57r4\" (UniqueName: \"kubernetes.io/projected/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d-kube-api-access-l57r4\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.153800 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ql5p\" (UniqueName: \"kubernetes.io/projected/a043f350-a471-483b-aa31-117d66b38cf5-kube-api-access-7ql5p\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.628913 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-4da5-account-create-update-46kvg" event={"ID":"a043f350-a471-483b-aa31-117d66b38cf5","Type":"ContainerDied","Data":"c0b9bfa0169c0070dfb285b44d1e661e93f3f3511bf2b08114aa0007936ee376"} Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.628953 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4da5-account-create-update-46kvg" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.628959 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0b9bfa0169c0070dfb285b44d1e661e93f3f3511bf2b08114aa0007936ee376" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.630281 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rzjvj" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.630333 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rzjvj" event={"ID":"0f18cba3-901e-45d3-9f1a-a04a17fe1b4d","Type":"ContainerDied","Data":"11d2da62088c651b9018284cbef1e99efd5b3be38ba0e40779a37ccf22c4de3b"} Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.630355 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11d2da62088c651b9018284cbef1e99efd5b3be38ba0e40779a37ccf22c4de3b" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.631626 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8lrs8" event={"ID":"80b41954-7fc3-4e18-9fde-08323d1a5aa6","Type":"ContainerDied","Data":"834343ddf5ef4552ae1f3e02d7afb2e7b980d3b5ecc1faf75b6a694d6e52042a"} Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.631663 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="834343ddf5ef4552ae1f3e02d7afb2e7b980d3b5ecc1faf75b6a694d6e52042a" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.631670 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8lrs8" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.901289 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.932596 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.945529 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.966346 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h79z9\" (UniqueName: \"kubernetes.io/projected/68884564-b059-4562-8410-956a50be744a-kube-api-access-h79z9\") pod \"68884564-b059-4562-8410-956a50be744a\" (UID: \"68884564-b059-4562-8410-956a50be744a\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.966507 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e8838f6-8981-4f7d-a871-f09435bfc1ee-operator-scripts\") pod \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\" (UID: \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.966584 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxmgr\" (UniqueName: \"kubernetes.io/projected/0e8838f6-8981-4f7d-a871-f09435bfc1ee-kube-api-access-sxmgr\") pod \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\" (UID: \"0e8838f6-8981-4f7d-a871-f09435bfc1ee\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.967068 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/68884564-b059-4562-8410-956a50be744a-operator-scripts\") pod \"68884564-b059-4562-8410-956a50be744a\" (UID: \"68884564-b059-4562-8410-956a50be744a\") " Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.967065 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e8838f6-8981-4f7d-a871-f09435bfc1ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e8838f6-8981-4f7d-a871-f09435bfc1ee" (UID: "0e8838f6-8981-4f7d-a871-f09435bfc1ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.967570 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68884564-b059-4562-8410-956a50be744a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68884564-b059-4562-8410-956a50be744a" (UID: "68884564-b059-4562-8410-956a50be744a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.967585 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e8838f6-8981-4f7d-a871-f09435bfc1ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.969196 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68884564-b059-4562-8410-956a50be744a-kube-api-access-h79z9" (OuterVolumeSpecName: "kube-api-access-h79z9") pod "68884564-b059-4562-8410-956a50be744a" (UID: "68884564-b059-4562-8410-956a50be744a"). InnerVolumeSpecName "kube-api-access-h79z9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:26 crc kubenswrapper[4834]: I1126 12:25:26.969335 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8838f6-8981-4f7d-a871-f09435bfc1ee-kube-api-access-sxmgr" (OuterVolumeSpecName: "kube-api-access-sxmgr") pod "0e8838f6-8981-4f7d-a871-f09435bfc1ee" (UID: "0e8838f6-8981-4f7d-a871-f09435bfc1ee"). InnerVolumeSpecName "kube-api-access-sxmgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.068886 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befcee03-7239-4d9e-915d-7fa7f9ebfb44-operator-scripts\") pod \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\" (UID: \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\") " Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.068994 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2drv\" (UniqueName: \"kubernetes.io/projected/befcee03-7239-4d9e-915d-7fa7f9ebfb44-kube-api-access-t2drv\") pod \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\" (UID: \"befcee03-7239-4d9e-915d-7fa7f9ebfb44\") " Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.069542 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h79z9\" (UniqueName: \"kubernetes.io/projected/68884564-b059-4562-8410-956a50be744a-kube-api-access-h79z9\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.069554 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxmgr\" (UniqueName: \"kubernetes.io/projected/0e8838f6-8981-4f7d-a871-f09435bfc1ee-kube-api-access-sxmgr\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.069564 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/68884564-b059-4562-8410-956a50be744a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.070481 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/befcee03-7239-4d9e-915d-7fa7f9ebfb44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "befcee03-7239-4d9e-915d-7fa7f9ebfb44" (UID: "befcee03-7239-4d9e-915d-7fa7f9ebfb44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.073584 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/befcee03-7239-4d9e-915d-7fa7f9ebfb44-kube-api-access-t2drv" (OuterVolumeSpecName: "kube-api-access-t2drv") pod "befcee03-7239-4d9e-915d-7fa7f9ebfb44" (UID: "befcee03-7239-4d9e-915d-7fa7f9ebfb44"). InnerVolumeSpecName "kube-api-access-t2drv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.170753 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befcee03-7239-4d9e-915d-7fa7f9ebfb44-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.170785 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2drv\" (UniqueName: \"kubernetes.io/projected/befcee03-7239-4d9e-915d-7fa7f9ebfb44-kube-api-access-t2drv\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.641986 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fa5-account-create-update-t2rx7" event={"ID":"0e8838f6-8981-4f7d-a871-f09435bfc1ee","Type":"ContainerDied","Data":"5848fd3396932ec337710062fb8b084fcb1fa6560e97d396f6a9ff0456315307"} Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.642035 4834 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5848fd3396932ec337710062fb8b084fcb1fa6560e97d396f6a9ff0456315307" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.642096 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fa5-account-create-update-t2rx7" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.644183 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wm9lb" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.644193 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wm9lb" event={"ID":"befcee03-7239-4d9e-915d-7fa7f9ebfb44","Type":"ContainerDied","Data":"de5d4ea76225c354d096a7d0d4f250f435440bad5689c3f26c19e4afbc003f76"} Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.644241 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de5d4ea76225c354d096a7d0d4f250f435440bad5689c3f26c19e4afbc003f76" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.646962 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c888-account-create-update-txj78" event={"ID":"68884564-b059-4562-8410-956a50be744a","Type":"ContainerDied","Data":"390fd442735d67e7556a62f7804344fc63555459b3d567e2e818b182238d66a6"} Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.647002 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="390fd442735d67e7556a62f7804344fc63555459b3d567e2e818b182238d66a6" Nov 26 12:25:27 crc kubenswrapper[4834]: I1126 12:25:27.647012 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c888-account-create-update-txj78" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019443 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-t2gkm"] Nov 26 12:25:29 crc kubenswrapper[4834]: E1126 12:25:29.019735 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befcee03-7239-4d9e-915d-7fa7f9ebfb44" containerName="mariadb-database-create" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019747 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="befcee03-7239-4d9e-915d-7fa7f9ebfb44" containerName="mariadb-database-create" Nov 26 12:25:29 crc kubenswrapper[4834]: E1126 12:25:29.019765 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f18cba3-901e-45d3-9f1a-a04a17fe1b4d" containerName="mariadb-database-create" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019772 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f18cba3-901e-45d3-9f1a-a04a17fe1b4d" containerName="mariadb-database-create" Nov 26 12:25:29 crc kubenswrapper[4834]: E1126 12:25:29.019783 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8838f6-8981-4f7d-a871-f09435bfc1ee" containerName="mariadb-account-create-update" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019790 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8838f6-8981-4f7d-a871-f09435bfc1ee" containerName="mariadb-account-create-update" Nov 26 12:25:29 crc kubenswrapper[4834]: E1126 12:25:29.019807 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b41954-7fc3-4e18-9fde-08323d1a5aa6" containerName="mariadb-database-create" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019812 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b41954-7fc3-4e18-9fde-08323d1a5aa6" containerName="mariadb-database-create" Nov 26 12:25:29 crc kubenswrapper[4834]: E1126 12:25:29.019822 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a043f350-a471-483b-aa31-117d66b38cf5" containerName="mariadb-account-create-update" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019827 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a043f350-a471-483b-aa31-117d66b38cf5" containerName="mariadb-account-create-update" Nov 26 12:25:29 crc kubenswrapper[4834]: E1126 12:25:29.019837 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68884564-b059-4562-8410-956a50be744a" containerName="mariadb-account-create-update" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019842 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="68884564-b059-4562-8410-956a50be744a" containerName="mariadb-account-create-update" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019971 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="68884564-b059-4562-8410-956a50be744a" containerName="mariadb-account-create-update" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019985 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a043f350-a471-483b-aa31-117d66b38cf5" containerName="mariadb-account-create-update" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.019996 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="befcee03-7239-4d9e-915d-7fa7f9ebfb44" containerName="mariadb-database-create" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.020006 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b41954-7fc3-4e18-9fde-08323d1a5aa6" containerName="mariadb-database-create" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.020015 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f18cba3-901e-45d3-9f1a-a04a17fe1b4d" containerName="mariadb-database-create" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.020023 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8838f6-8981-4f7d-a871-f09435bfc1ee" 
containerName="mariadb-account-create-update" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.020495 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.022203 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9wfxh" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.022404 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.025809 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-t2gkm"] Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.101601 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-combined-ca-bundle\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.101664 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdj2\" (UniqueName: \"kubernetes.io/projected/6d56ce8c-1412-4272-8905-e251251f4f64-kube-api-access-2qdj2\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.101954 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-db-sync-config-data\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.101997 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-config-data\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.203948 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-combined-ca-bundle\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.204003 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdj2\" (UniqueName: \"kubernetes.io/projected/6d56ce8c-1412-4272-8905-e251251f4f64-kube-api-access-2qdj2\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.204092 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-db-sync-config-data\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.204117 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-config-data\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.208941 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-db-sync-config-data\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.209022 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-config-data\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.210143 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-combined-ca-bundle\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.218452 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdj2\" (UniqueName: \"kubernetes.io/projected/6d56ce8c-1412-4272-8905-e251251f4f64-kube-api-access-2qdj2\") pod \"glance-db-sync-t2gkm\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.333128 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:29 crc kubenswrapper[4834]: I1126 12:25:29.774818 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-t2gkm"] Nov 26 12:25:29 crc kubenswrapper[4834]: W1126 12:25:29.777741 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d56ce8c_1412_4272_8905_e251251f4f64.slice/crio-da33997228dbbd0a5e8a9791ab5c6947cf354a96150e00baa76e91feea936380 WatchSource:0}: Error finding container da33997228dbbd0a5e8a9791ab5c6947cf354a96150e00baa76e91feea936380: Status 404 returned error can't find the container with id da33997228dbbd0a5e8a9791ab5c6947cf354a96150e00baa76e91feea936380 Nov 26 12:25:30 crc kubenswrapper[4834]: I1126 12:25:30.671524 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t2gkm" event={"ID":"6d56ce8c-1412-4272-8905-e251251f4f64","Type":"ContainerStarted","Data":"da33997228dbbd0a5e8a9791ab5c6947cf354a96150e00baa76e91feea936380"} Nov 26 12:25:34 crc kubenswrapper[4834]: I1126 12:25:34.012279 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 26 12:25:38 crc kubenswrapper[4834]: I1126 12:25:38.745052 4834 generic.go:334] "Generic (PLEG): container finished" podID="3f30c5fe-7895-474e-a94d-967b23650025" containerID="f2360140e760e2b0540dfc06873720e62ea3d3673fe4b602b6263cef14fad59a" exitCode=0 Nov 26 12:25:38 crc kubenswrapper[4834]: I1126 12:25:38.745144 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f30c5fe-7895-474e-a94d-967b23650025","Type":"ContainerDied","Data":"f2360140e760e2b0540dfc06873720e62ea3d3673fe4b602b6263cef14fad59a"} Nov 26 12:25:38 crc kubenswrapper[4834]: I1126 12:25:38.747111 4834 generic.go:334] "Generic (PLEG): container finished" podID="25058ace-b8d8-4ac1-924a-844f3955f0fb" 
containerID="2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c" exitCode=0 Nov 26 12:25:38 crc kubenswrapper[4834]: I1126 12:25:38.747165 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"25058ace-b8d8-4ac1-924a-844f3955f0fb","Type":"ContainerDied","Data":"2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c"} Nov 26 12:25:38 crc kubenswrapper[4834]: I1126 12:25:38.965902 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-n6tm2" podUID="b0ec582c-ead4-4350-ac7f-530f80804717" containerName="ovn-controller" probeResult="failure" output=< Nov 26 12:25:38 crc kubenswrapper[4834]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 26 12:25:38 crc kubenswrapper[4834]: > Nov 26 12:25:39 crc kubenswrapper[4834]: I1126 12:25:39.046231 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:25:40 crc kubenswrapper[4834]: I1126 12:25:40.767546 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f30c5fe-7895-474e-a94d-967b23650025","Type":"ContainerStarted","Data":"74d0607133594134910f3e5a2b2b986d1d3c7f8763f17093a89e8ac30861a21b"} Nov 26 12:25:40 crc kubenswrapper[4834]: I1126 12:25:40.768563 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 26 12:25:40 crc kubenswrapper[4834]: I1126 12:25:40.770601 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"25058ace-b8d8-4ac1-924a-844f3955f0fb","Type":"ContainerStarted","Data":"066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f"} Nov 26 12:25:40 crc kubenswrapper[4834]: I1126 12:25:40.770806 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:25:40 crc 
kubenswrapper[4834]: I1126 12:25:40.772824 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t2gkm" event={"ID":"6d56ce8c-1412-4272-8905-e251251f4f64","Type":"ContainerStarted","Data":"b0909226f01908f9f2ded595fced81eb1c80fe80c1f5e098da1d11534c07302e"} Nov 26 12:25:40 crc kubenswrapper[4834]: I1126 12:25:40.796479 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.372543756 podStartE2EDuration="52.796462633s" podCreationTimestamp="2025-11-26 12:24:48 +0000 UTC" firstStartedPulling="2025-11-26 12:24:57.979164594 +0000 UTC m=+795.886377946" lastFinishedPulling="2025-11-26 12:25:04.403083472 +0000 UTC m=+802.310296823" observedRunningTime="2025-11-26 12:25:40.789470067 +0000 UTC m=+838.696683419" watchObservedRunningTime="2025-11-26 12:25:40.796462633 +0000 UTC m=+838.703675984" Nov 26 12:25:40 crc kubenswrapper[4834]: I1126 12:25:40.804721 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-t2gkm" podStartSLOduration=1.979094844 podStartE2EDuration="11.804707351s" podCreationTimestamp="2025-11-26 12:25:29 +0000 UTC" firstStartedPulling="2025-11-26 12:25:29.779749057 +0000 UTC m=+827.686962409" lastFinishedPulling="2025-11-26 12:25:39.605361564 +0000 UTC m=+837.512574916" observedRunningTime="2025-11-26 12:25:40.804108992 +0000 UTC m=+838.711322344" watchObservedRunningTime="2025-11-26 12:25:40.804707351 +0000 UTC m=+838.711920703" Nov 26 12:25:40 crc kubenswrapper[4834]: I1126 12:25:40.824047 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.873983865 podStartE2EDuration="52.824032675s" podCreationTimestamp="2025-11-26 12:24:48 +0000 UTC" firstStartedPulling="2025-11-26 12:24:58.382412485 +0000 UTC m=+796.289625837" lastFinishedPulling="2025-11-26 12:25:05.332461295 +0000 UTC m=+803.239674647" 
observedRunningTime="2025-11-26 12:25:40.818694086 +0000 UTC m=+838.725907439" watchObservedRunningTime="2025-11-26 12:25:40.824032675 +0000 UTC m=+838.731246027" Nov 26 12:25:43 crc kubenswrapper[4834]: I1126 12:25:43.958142 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-n6tm2" podUID="b0ec582c-ead4-4350-ac7f-530f80804717" containerName="ovn-controller" probeResult="failure" output=< Nov 26 12:25:43 crc kubenswrapper[4834]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 26 12:25:43 crc kubenswrapper[4834]: > Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.041619 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7b48f" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.231804 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-n6tm2-config-x5pdw"] Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.232810 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.234190 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.247228 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-n6tm2-config-x5pdw"] Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.307222 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.307262 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run-ovn\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.307331 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-additional-scripts\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.307355 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xfjt\" (UniqueName: \"kubernetes.io/projected/8ae29276-1024-4fd0-9aee-25c0fd79e086-kube-api-access-4xfjt\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: 
\"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.307626 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-log-ovn\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.307813 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-scripts\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.409304 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run-ovn\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.409428 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-additional-scripts\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.409456 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xfjt\" (UniqueName: \"kubernetes.io/projected/8ae29276-1024-4fd0-9aee-25c0fd79e086-kube-api-access-4xfjt\") pod 
\"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.409537 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-log-ovn\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.409590 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-scripts\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.409609 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.409777 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run-ovn\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.409837 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: 
\"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.409860 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-log-ovn\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.410159 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-additional-scripts\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.411493 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-scripts\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.428390 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xfjt\" (UniqueName: \"kubernetes.io/projected/8ae29276-1024-4fd0-9aee-25c0fd79e086-kube-api-access-4xfjt\") pod \"ovn-controller-n6tm2-config-x5pdw\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.546529 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.810582 4834 generic.go:334] "Generic (PLEG): container finished" podID="6d56ce8c-1412-4272-8905-e251251f4f64" containerID="b0909226f01908f9f2ded595fced81eb1c80fe80c1f5e098da1d11534c07302e" exitCode=0 Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.810673 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t2gkm" event={"ID":"6d56ce8c-1412-4272-8905-e251251f4f64","Type":"ContainerDied","Data":"b0909226f01908f9f2ded595fced81eb1c80fe80c1f5e098da1d11534c07302e"} Nov 26 12:25:44 crc kubenswrapper[4834]: I1126 12:25:44.932127 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-n6tm2-config-x5pdw"] Nov 26 12:25:44 crc kubenswrapper[4834]: W1126 12:25:44.936957 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae29276_1024_4fd0_9aee_25c0fd79e086.slice/crio-31ff259c2c8028bbc69c57e1d493bf465262d61d2c2299c575f5acb9239d37b7 WatchSource:0}: Error finding container 31ff259c2c8028bbc69c57e1d493bf465262d61d2c2299c575f5acb9239d37b7: Status 404 returned error can't find the container with id 31ff259c2c8028bbc69c57e1d493bf465262d61d2c2299c575f5acb9239d37b7 Nov 26 12:25:45 crc kubenswrapper[4834]: I1126 12:25:45.819325 4834 generic.go:334] "Generic (PLEG): container finished" podID="8ae29276-1024-4fd0-9aee-25c0fd79e086" containerID="c720a02f9bff5b293043fe87ba72ad95273d785cbff5ff2a9ef4649e77c00aab" exitCode=0 Nov 26 12:25:45 crc kubenswrapper[4834]: I1126 12:25:45.819378 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-n6tm2-config-x5pdw" event={"ID":"8ae29276-1024-4fd0-9aee-25c0fd79e086","Type":"ContainerDied","Data":"c720a02f9bff5b293043fe87ba72ad95273d785cbff5ff2a9ef4649e77c00aab"} Nov 26 12:25:45 crc kubenswrapper[4834]: I1126 12:25:45.819838 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-n6tm2-config-x5pdw" event={"ID":"8ae29276-1024-4fd0-9aee-25c0fd79e086","Type":"ContainerStarted","Data":"31ff259c2c8028bbc69c57e1d493bf465262d61d2c2299c575f5acb9239d37b7"} Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.123101 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.240508 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-combined-ca-bundle\") pod \"6d56ce8c-1412-4272-8905-e251251f4f64\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.240843 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-db-sync-config-data\") pod \"6d56ce8c-1412-4272-8905-e251251f4f64\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.240871 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qdj2\" (UniqueName: \"kubernetes.io/projected/6d56ce8c-1412-4272-8905-e251251f4f64-kube-api-access-2qdj2\") pod \"6d56ce8c-1412-4272-8905-e251251f4f64\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.240958 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-config-data\") pod \"6d56ce8c-1412-4272-8905-e251251f4f64\" (UID: \"6d56ce8c-1412-4272-8905-e251251f4f64\") " Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.258514 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/6d56ce8c-1412-4272-8905-e251251f4f64-kube-api-access-2qdj2" (OuterVolumeSpecName: "kube-api-access-2qdj2") pod "6d56ce8c-1412-4272-8905-e251251f4f64" (UID: "6d56ce8c-1412-4272-8905-e251251f4f64"). InnerVolumeSpecName "kube-api-access-2qdj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.259182 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d56ce8c-1412-4272-8905-e251251f4f64" (UID: "6d56ce8c-1412-4272-8905-e251251f4f64"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.264293 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d56ce8c-1412-4272-8905-e251251f4f64" (UID: "6d56ce8c-1412-4272-8905-e251251f4f64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.282242 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-config-data" (OuterVolumeSpecName: "config-data") pod "6d56ce8c-1412-4272-8905-e251251f4f64" (UID: "6d56ce8c-1412-4272-8905-e251251f4f64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.344410 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qdj2\" (UniqueName: \"kubernetes.io/projected/6d56ce8c-1412-4272-8905-e251251f4f64-kube-api-access-2qdj2\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.344458 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.344473 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.344485 4834 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d56ce8c-1412-4272-8905-e251251f4f64-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.826920 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t2gkm" event={"ID":"6d56ce8c-1412-4272-8905-e251251f4f64","Type":"ContainerDied","Data":"da33997228dbbd0a5e8a9791ab5c6947cf354a96150e00baa76e91feea936380"} Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.826985 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da33997228dbbd0a5e8a9791ab5c6947cf354a96150e00baa76e91feea936380" Nov 26 12:25:46 crc kubenswrapper[4834]: I1126 12:25:46.826994 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-t2gkm" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.067362 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.146967 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75b58765b5-2sx2l"] Nov 26 12:25:47 crc kubenswrapper[4834]: E1126 12:25:47.147253 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae29276-1024-4fd0-9aee-25c0fd79e086" containerName="ovn-config" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.147265 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae29276-1024-4fd0-9aee-25c0fd79e086" containerName="ovn-config" Nov 26 12:25:47 crc kubenswrapper[4834]: E1126 12:25:47.147284 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d56ce8c-1412-4272-8905-e251251f4f64" containerName="glance-db-sync" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.147290 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d56ce8c-1412-4272-8905-e251251f4f64" containerName="glance-db-sync" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.148747 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d56ce8c-1412-4272-8905-e251251f4f64" containerName="glance-db-sync" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.148771 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae29276-1024-4fd0-9aee-25c0fd79e086" containerName="ovn-config" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.149531 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.157126 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-additional-scripts\") pod \"8ae29276-1024-4fd0-9aee-25c0fd79e086\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.157199 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run-ovn\") pod \"8ae29276-1024-4fd0-9aee-25c0fd79e086\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.157278 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-scripts\") pod \"8ae29276-1024-4fd0-9aee-25c0fd79e086\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.157350 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xfjt\" (UniqueName: \"kubernetes.io/projected/8ae29276-1024-4fd0-9aee-25c0fd79e086-kube-api-access-4xfjt\") pod \"8ae29276-1024-4fd0-9aee-25c0fd79e086\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.157370 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-log-ovn\") pod \"8ae29276-1024-4fd0-9aee-25c0fd79e086\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.157404 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run\") pod \"8ae29276-1024-4fd0-9aee-25c0fd79e086\" (UID: \"8ae29276-1024-4fd0-9aee-25c0fd79e086\") " Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.157749 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run" (OuterVolumeSpecName: "var-run") pod "8ae29276-1024-4fd0-9aee-25c0fd79e086" (UID: "8ae29276-1024-4fd0-9aee-25c0fd79e086"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.158635 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8ae29276-1024-4fd0-9aee-25c0fd79e086" (UID: "8ae29276-1024-4fd0-9aee-25c0fd79e086"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.158666 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8ae29276-1024-4fd0-9aee-25c0fd79e086" (UID: "8ae29276-1024-4fd0-9aee-25c0fd79e086"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.159198 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-scripts" (OuterVolumeSpecName: "scripts") pod "8ae29276-1024-4fd0-9aee-25c0fd79e086" (UID: "8ae29276-1024-4fd0-9aee-25c0fd79e086"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.160188 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8ae29276-1024-4fd0-9aee-25c0fd79e086" (UID: "8ae29276-1024-4fd0-9aee-25c0fd79e086"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.162871 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae29276-1024-4fd0-9aee-25c0fd79e086-kube-api-access-4xfjt" (OuterVolumeSpecName: "kube-api-access-4xfjt") pod "8ae29276-1024-4fd0-9aee-25c0fd79e086" (UID: "8ae29276-1024-4fd0-9aee-25c0fd79e086"). InnerVolumeSpecName "kube-api-access-4xfjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.174099 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b58765b5-2sx2l"] Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259136 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-sb\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259262 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-dns-svc\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259292 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-nb\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259331 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-config\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259363 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x75vl\" (UniqueName: \"kubernetes.io/projected/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-kube-api-access-x75vl\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259425 4834 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259437 4834 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259446 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae29276-1024-4fd0-9aee-25c0fd79e086-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 
12:25:47.259454 4834 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259462 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xfjt\" (UniqueName: \"kubernetes.io/projected/8ae29276-1024-4fd0-9aee-25c0fd79e086-kube-api-access-4xfjt\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.259471 4834 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ae29276-1024-4fd0-9aee-25c0fd79e086-var-run\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.360673 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-dns-svc\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.360721 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-nb\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.360745 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-config\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.360768 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x75vl\" (UniqueName: \"kubernetes.io/projected/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-kube-api-access-x75vl\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.360802 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-sb\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.361597 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-sb\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.362059 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-config\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.362135 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-dns-svc\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.362431 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-nb\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.389122 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x75vl\" (UniqueName: \"kubernetes.io/projected/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-kube-api-access-x75vl\") pod \"dnsmasq-dns-75b58765b5-2sx2l\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.466571 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.833580 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-n6tm2-config-x5pdw" event={"ID":"8ae29276-1024-4fd0-9aee-25c0fd79e086","Type":"ContainerDied","Data":"31ff259c2c8028bbc69c57e1d493bf465262d61d2c2299c575f5acb9239d37b7"} Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.833926 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ff259c2c8028bbc69c57e1d493bf465262d61d2c2299c575f5acb9239d37b7" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.833622 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-n6tm2-config-x5pdw" Nov 26 12:25:47 crc kubenswrapper[4834]: I1126 12:25:47.889001 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b58765b5-2sx2l"] Nov 26 12:25:47 crc kubenswrapper[4834]: W1126 12:25:47.890828 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod722cd8b2_0b69_4de7_8d3c_8f794dafb9a5.slice/crio-196f23abfc82092ea1b30a492cd702d20139b45e360c129f92a77134f1abdfb7 WatchSource:0}: Error finding container 196f23abfc82092ea1b30a492cd702d20139b45e360c129f92a77134f1abdfb7: Status 404 returned error can't find the container with id 196f23abfc82092ea1b30a492cd702d20139b45e360c129f92a77134f1abdfb7 Nov 26 12:25:48 crc kubenswrapper[4834]: I1126 12:25:48.174920 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-n6tm2-config-x5pdw"] Nov 26 12:25:48 crc kubenswrapper[4834]: I1126 12:25:48.182575 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-n6tm2-config-x5pdw"] Nov 26 12:25:48 crc kubenswrapper[4834]: I1126 12:25:48.425566 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae29276-1024-4fd0-9aee-25c0fd79e086" path="/var/lib/kubelet/pods/8ae29276-1024-4fd0-9aee-25c0fd79e086/volumes" Nov 26 12:25:48 crc kubenswrapper[4834]: I1126 12:25:48.846744 4834 generic.go:334] "Generic (PLEG): container finished" podID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerID="5c52e0b9faa634b6d7453e8992ee6fbedf1d0af971e81430138d6b67979f4144" exitCode=0 Nov 26 12:25:48 crc kubenswrapper[4834]: I1126 12:25:48.846854 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" event={"ID":"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5","Type":"ContainerDied","Data":"5c52e0b9faa634b6d7453e8992ee6fbedf1d0af971e81430138d6b67979f4144"} Nov 26 12:25:48 crc kubenswrapper[4834]: I1126 12:25:48.847137 
4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" event={"ID":"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5","Type":"ContainerStarted","Data":"196f23abfc82092ea1b30a492cd702d20139b45e360c129f92a77134f1abdfb7"} Nov 26 12:25:48 crc kubenswrapper[4834]: I1126 12:25:48.985354 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-n6tm2" Nov 26 12:25:49 crc kubenswrapper[4834]: I1126 12:25:49.858240 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" event={"ID":"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5","Type":"ContainerStarted","Data":"27bae8479412e3408c968208fbf63e1480b1083adf97a9c52ac6329d7364ed93"} Nov 26 12:25:49 crc kubenswrapper[4834]: I1126 12:25:49.858725 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:49 crc kubenswrapper[4834]: I1126 12:25:49.877057 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" podStartSLOduration=2.877030334 podStartE2EDuration="2.877030334s" podCreationTimestamp="2025-11-26 12:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:25:49.873729749 +0000 UTC m=+847.780943101" watchObservedRunningTime="2025-11-26 12:25:49.877030334 +0000 UTC m=+847.784243687" Nov 26 12:25:50 crc kubenswrapper[4834]: I1126 12:25:50.344525 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:25:50 crc kubenswrapper[4834]: I1126 12:25:50.381480 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.788784 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8q49t"] Nov 26 
12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.790018 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.802189 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8q49t"] Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.847697 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16ebb8de-ce70-4164-8375-945db41f55e4-operator-scripts\") pod \"cinder-db-create-8q49t\" (UID: \"16ebb8de-ce70-4164-8375-945db41f55e4\") " pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.847800 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sq9t\" (UniqueName: \"kubernetes.io/projected/16ebb8de-ce70-4164-8375-945db41f55e4-kube-api-access-2sq9t\") pod \"cinder-db-create-8q49t\" (UID: \"16ebb8de-ce70-4164-8375-945db41f55e4\") " pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.899251 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d03a-account-create-update-2ttnl"] Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.900176 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.901861 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.912659 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qmhct"] Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.913497 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.922356 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d03a-account-create-update-2ttnl"] Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.927866 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qmhct"] Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.949020 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-497h9\" (UniqueName: \"kubernetes.io/projected/c681553c-78af-4a9a-bcf1-a03424706e78-kube-api-access-497h9\") pod \"barbican-d03a-account-create-update-2ttnl\" (UID: \"c681553c-78af-4a9a-bcf1-a03424706e78\") " pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.949218 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zms7v\" (UniqueName: \"kubernetes.io/projected/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-kube-api-access-zms7v\") pod \"barbican-db-create-qmhct\" (UID: \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\") " pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.949304 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c681553c-78af-4a9a-bcf1-a03424706e78-operator-scripts\") pod \"barbican-d03a-account-create-update-2ttnl\" (UID: \"c681553c-78af-4a9a-bcf1-a03424706e78\") " pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.949425 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16ebb8de-ce70-4164-8375-945db41f55e4-operator-scripts\") pod \"cinder-db-create-8q49t\" 
(UID: \"16ebb8de-ce70-4164-8375-945db41f55e4\") " pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.949488 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-operator-scripts\") pod \"barbican-db-create-qmhct\" (UID: \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\") " pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.949609 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sq9t\" (UniqueName: \"kubernetes.io/projected/16ebb8de-ce70-4164-8375-945db41f55e4-kube-api-access-2sq9t\") pod \"cinder-db-create-8q49t\" (UID: \"16ebb8de-ce70-4164-8375-945db41f55e4\") " pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.950067 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16ebb8de-ce70-4164-8375-945db41f55e4-operator-scripts\") pod \"cinder-db-create-8q49t\" (UID: \"16ebb8de-ce70-4164-8375-945db41f55e4\") " pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:51 crc kubenswrapper[4834]: I1126 12:25:51.981540 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sq9t\" (UniqueName: \"kubernetes.io/projected/16ebb8de-ce70-4164-8375-945db41f55e4-kube-api-access-2sq9t\") pod \"cinder-db-create-8q49t\" (UID: \"16ebb8de-ce70-4164-8375-945db41f55e4\") " pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.020684 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-04bc-account-create-update-5k2d8"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.021497 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.031279 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.036723 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-04bc-account-create-update-5k2d8"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.054097 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfv8l\" (UniqueName: \"kubernetes.io/projected/e912bd72-a05d-4456-92f8-42a5eca2621b-kube-api-access-vfv8l\") pod \"cinder-04bc-account-create-update-5k2d8\" (UID: \"e912bd72-a05d-4456-92f8-42a5eca2621b\") " pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.054149 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-operator-scripts\") pod \"barbican-db-create-qmhct\" (UID: \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\") " pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.054406 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e912bd72-a05d-4456-92f8-42a5eca2621b-operator-scripts\") pod \"cinder-04bc-account-create-update-5k2d8\" (UID: \"e912bd72-a05d-4456-92f8-42a5eca2621b\") " pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.054544 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-497h9\" (UniqueName: \"kubernetes.io/projected/c681553c-78af-4a9a-bcf1-a03424706e78-kube-api-access-497h9\") pod 
\"barbican-d03a-account-create-update-2ttnl\" (UID: \"c681553c-78af-4a9a-bcf1-a03424706e78\") " pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.054656 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zms7v\" (UniqueName: \"kubernetes.io/projected/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-kube-api-access-zms7v\") pod \"barbican-db-create-qmhct\" (UID: \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\") " pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.054694 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c681553c-78af-4a9a-bcf1-a03424706e78-operator-scripts\") pod \"barbican-d03a-account-create-update-2ttnl\" (UID: \"c681553c-78af-4a9a-bcf1-a03424706e78\") " pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.054728 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-operator-scripts\") pod \"barbican-db-create-qmhct\" (UID: \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\") " pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.055330 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c681553c-78af-4a9a-bcf1-a03424706e78-operator-scripts\") pod \"barbican-d03a-account-create-update-2ttnl\" (UID: \"c681553c-78af-4a9a-bcf1-a03424706e78\") " pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.076843 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zms7v\" (UniqueName: 
\"kubernetes.io/projected/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-kube-api-access-zms7v\") pod \"barbican-db-create-qmhct\" (UID: \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\") " pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.077353 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-497h9\" (UniqueName: \"kubernetes.io/projected/c681553c-78af-4a9a-bcf1-a03424706e78-kube-api-access-497h9\") pod \"barbican-d03a-account-create-update-2ttnl\" (UID: \"c681553c-78af-4a9a-bcf1-a03424706e78\") " pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.115235 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7742r"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.116110 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7742r" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.116972 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.139005 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7742r"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.155693 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-operator-scripts\") pod \"neutron-db-create-7742r\" (UID: \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\") " pod="openstack/neutron-db-create-7742r" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.155734 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwkb8\" (UniqueName: \"kubernetes.io/projected/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-kube-api-access-rwkb8\") pod \"neutron-db-create-7742r\" (UID: \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\") " pod="openstack/neutron-db-create-7742r" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.155803 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfv8l\" (UniqueName: \"kubernetes.io/projected/e912bd72-a05d-4456-92f8-42a5eca2621b-kube-api-access-vfv8l\") pod \"cinder-04bc-account-create-update-5k2d8\" (UID: \"e912bd72-a05d-4456-92f8-42a5eca2621b\") " pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.155873 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e912bd72-a05d-4456-92f8-42a5eca2621b-operator-scripts\") pod \"cinder-04bc-account-create-update-5k2d8\" (UID: \"e912bd72-a05d-4456-92f8-42a5eca2621b\") " pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.156459 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e912bd72-a05d-4456-92f8-42a5eca2621b-operator-scripts\") pod \"cinder-04bc-account-create-update-5k2d8\" (UID: \"e912bd72-a05d-4456-92f8-42a5eca2621b\") " pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.174812 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfv8l\" (UniqueName: \"kubernetes.io/projected/e912bd72-a05d-4456-92f8-42a5eca2621b-kube-api-access-vfv8l\") pod \"cinder-04bc-account-create-update-5k2d8\" (UID: \"e912bd72-a05d-4456-92f8-42a5eca2621b\") " pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.213642 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.214271 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9p5t2"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.215066 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.219671 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.219924 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.220114 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.220270 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8frzm" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.224253 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.232531 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9p5t2"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.257097 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-combined-ca-bundle\") pod \"keystone-db-sync-9p5t2\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.257327 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-operator-scripts\") pod \"neutron-db-create-7742r\" (UID: \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\") " pod="openstack/neutron-db-create-7742r" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.257361 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rwkb8\" (UniqueName: \"kubernetes.io/projected/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-kube-api-access-rwkb8\") pod \"neutron-db-create-7742r\" (UID: \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\") " pod="openstack/neutron-db-create-7742r" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.257395 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-config-data\") pod \"keystone-db-sync-9p5t2\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.257444 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzvpq\" (UniqueName: \"kubernetes.io/projected/1a0e6cda-4565-4342-b83f-39df9cdd4207-kube-api-access-zzvpq\") pod \"keystone-db-sync-9p5t2\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.258010 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-operator-scripts\") pod \"neutron-db-create-7742r\" (UID: \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\") " pod="openstack/neutron-db-create-7742r" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.275777 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwkb8\" (UniqueName: \"kubernetes.io/projected/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-kube-api-access-rwkb8\") pod \"neutron-db-create-7742r\" (UID: \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\") " pod="openstack/neutron-db-create-7742r" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.316874 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0e4b-account-create-update-z6xkv"] Nov 26 12:25:52 crc 
kubenswrapper[4834]: I1126 12:25:52.318200 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.319944 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.325948 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0e4b-account-create-update-z6xkv"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.340394 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.358636 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-config-data\") pod \"keystone-db-sync-9p5t2\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.358700 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzvpq\" (UniqueName: \"kubernetes.io/projected/1a0e6cda-4565-4342-b83f-39df9cdd4207-kube-api-access-zzvpq\") pod \"keystone-db-sync-9p5t2\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.358746 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-operator-scripts\") pod \"neutron-0e4b-account-create-update-z6xkv\" (UID: \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\") " pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.358770 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqpvf\" (UniqueName: \"kubernetes.io/projected/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-kube-api-access-bqpvf\") pod \"neutron-0e4b-account-create-update-z6xkv\" (UID: \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\") " pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.358801 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-combined-ca-bundle\") pod \"keystone-db-sync-9p5t2\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.361930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-combined-ca-bundle\") pod \"keystone-db-sync-9p5t2\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.369848 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-config-data\") pod \"keystone-db-sync-9p5t2\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.377848 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzvpq\" (UniqueName: \"kubernetes.io/projected/1a0e6cda-4565-4342-b83f-39df9cdd4207-kube-api-access-zzvpq\") pod \"keystone-db-sync-9p5t2\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.459906 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-operator-scripts\") pod \"neutron-0e4b-account-create-update-z6xkv\" (UID: \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\") " pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.459962 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqpvf\" (UniqueName: \"kubernetes.io/projected/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-kube-api-access-bqpvf\") pod \"neutron-0e4b-account-create-update-z6xkv\" (UID: \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\") " pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.460640 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-operator-scripts\") pod \"neutron-0e4b-account-create-update-z6xkv\" (UID: \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\") " pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.480090 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqpvf\" (UniqueName: \"kubernetes.io/projected/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-kube-api-access-bqpvf\") pod \"neutron-0e4b-account-create-update-z6xkv\" (UID: \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\") " pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.507599 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7742r" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.543674 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.633828 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.635952 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8q49t"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.734226 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d03a-account-create-update-2ttnl"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.752492 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qmhct"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.871023 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-04bc-account-create-update-5k2d8"] Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.884399 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d03a-account-create-update-2ttnl" event={"ID":"c681553c-78af-4a9a-bcf1-a03424706e78","Type":"ContainerStarted","Data":"cff937b70429181c1fcbad6323c4706c920456e6ed8173f4be49bf0b065c92b6"} Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.887036 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qmhct" event={"ID":"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef","Type":"ContainerStarted","Data":"fe3659554d0bf545881f7fb5e60019b6d6d2acd3a92c5ce512ac7e8337493a25"} Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.891275 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8q49t" event={"ID":"16ebb8de-ce70-4164-8375-945db41f55e4","Type":"ContainerStarted","Data":"751c2e04de4e897361d5bc61b03c3ed78e5c4b5558a3751f8654ae5c125ba76f"} Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.891335 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8q49t" 
event={"ID":"16ebb8de-ce70-4164-8375-945db41f55e4","Type":"ContainerStarted","Data":"dc33177cd6fe84bb660f915bd34d7e2f0968da723f3ad63f24af14a9e6d15103"} Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.903925 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-8q49t" podStartSLOduration=1.90391489 podStartE2EDuration="1.90391489s" podCreationTimestamp="2025-11-26 12:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:25:52.902135779 +0000 UTC m=+850.809349131" watchObservedRunningTime="2025-11-26 12:25:52.90391489 +0000 UTC m=+850.811128242" Nov 26 12:25:52 crc kubenswrapper[4834]: I1126 12:25:52.951117 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7742r"] Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.023259 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9p5t2"] Nov 26 12:25:53 crc kubenswrapper[4834]: W1126 12:25:53.033002 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a0e6cda_4565_4342_b83f_39df9cdd4207.slice/crio-91e6af03f101a770915731e80b7445e97001579ca2eb03413247fa00e26de4e6 WatchSource:0}: Error finding container 91e6af03f101a770915731e80b7445e97001579ca2eb03413247fa00e26de4e6: Status 404 returned error can't find the container with id 91e6af03f101a770915731e80b7445e97001579ca2eb03413247fa00e26de4e6 Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.109902 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0e4b-account-create-update-z6xkv"] Nov 26 12:25:53 crc kubenswrapper[4834]: W1126 12:25:53.145707 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8842d2_f5fc_49bc_a2de_2b71d41fd6a6.slice/crio-85d36dfedf16c3c10efb3b3f5668ee1b7cd699ed4bb502ffa18fa9727b8c17cc WatchSource:0}: Error finding container 85d36dfedf16c3c10efb3b3f5668ee1b7cd699ed4bb502ffa18fa9727b8c17cc: Status 404 returned error can't find the container with id 85d36dfedf16c3c10efb3b3f5668ee1b7cd699ed4bb502ffa18fa9727b8c17cc Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.903342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9p5t2" event={"ID":"1a0e6cda-4565-4342-b83f-39df9cdd4207","Type":"ContainerStarted","Data":"91e6af03f101a770915731e80b7445e97001579ca2eb03413247fa00e26de4e6"} Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.904721 4834 generic.go:334] "Generic (PLEG): container finished" podID="e912bd72-a05d-4456-92f8-42a5eca2621b" containerID="a89add7f786792e2eb6b0f82ac3f6a2baf92706d68b30113fefc9c08e6a68a32" exitCode=0 Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.904783 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-04bc-account-create-update-5k2d8" event={"ID":"e912bd72-a05d-4456-92f8-42a5eca2621b","Type":"ContainerDied","Data":"a89add7f786792e2eb6b0f82ac3f6a2baf92706d68b30113fefc9c08e6a68a32"} Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.904810 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-04bc-account-create-update-5k2d8" event={"ID":"e912bd72-a05d-4456-92f8-42a5eca2621b","Type":"ContainerStarted","Data":"44a51db49cd5e2b02af006193545cf64ebd9e32b001054be156b34d60de149fe"} Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.906458 4834 generic.go:334] "Generic (PLEG): container finished" podID="16ebb8de-ce70-4164-8375-945db41f55e4" containerID="751c2e04de4e897361d5bc61b03c3ed78e5c4b5558a3751f8654ae5c125ba76f" exitCode=0 Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.906516 4834 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-db-create-8q49t" event={"ID":"16ebb8de-ce70-4164-8375-945db41f55e4","Type":"ContainerDied","Data":"751c2e04de4e897361d5bc61b03c3ed78e5c4b5558a3751f8654ae5c125ba76f"} Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.908766 4834 generic.go:334] "Generic (PLEG): container finished" podID="c681553c-78af-4a9a-bcf1-a03424706e78" containerID="0d0e0711aeb57d676157a6ba95bf757efc273fdf0bed1f12bcd6556a7998ef13" exitCode=0 Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.908831 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d03a-account-create-update-2ttnl" event={"ID":"c681553c-78af-4a9a-bcf1-a03424706e78","Type":"ContainerDied","Data":"0d0e0711aeb57d676157a6ba95bf757efc273fdf0bed1f12bcd6556a7998ef13"} Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.910627 4834 generic.go:334] "Generic (PLEG): container finished" podID="09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6" containerID="32f6e03b9de3c2ed3c5798007a388c3d7cc327119438ee13cc0406f46ddd0c4b" exitCode=0 Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.910682 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7742r" event={"ID":"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6","Type":"ContainerDied","Data":"32f6e03b9de3c2ed3c5798007a388c3d7cc327119438ee13cc0406f46ddd0c4b"} Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.910699 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7742r" event={"ID":"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6","Type":"ContainerStarted","Data":"1c9753f79d0e3d1f2202d5a26927fb1e206d7306d9ba31f6a91a7a1a61269cdf"} Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.912014 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qmhct" event={"ID":"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef","Type":"ContainerDied","Data":"4c73203051b391700f57910c10d27f99f5acadaf3336763b2e0f10478367f55a"} Nov 26 12:25:53 crc kubenswrapper[4834]: 
I1126 12:25:53.911946 4834 generic.go:334] "Generic (PLEG): container finished" podID="a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef" containerID="4c73203051b391700f57910c10d27f99f5acadaf3336763b2e0f10478367f55a" exitCode=0 Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.917306 4834 generic.go:334] "Generic (PLEG): container finished" podID="ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6" containerID="2bf4127e75887f916ec9f7fb22b34b8a25180ec24233c82c3eb8867ad84444f1" exitCode=0 Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.917363 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e4b-account-create-update-z6xkv" event={"ID":"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6","Type":"ContainerDied","Data":"2bf4127e75887f916ec9f7fb22b34b8a25180ec24233c82c3eb8867ad84444f1"} Nov 26 12:25:53 crc kubenswrapper[4834]: I1126 12:25:53.917398 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e4b-account-create-update-z6xkv" event={"ID":"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6","Type":"ContainerStarted","Data":"85d36dfedf16c3c10efb3b3f5668ee1b7cd699ed4bb502ffa18fa9727b8c17cc"} Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.923099 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.929407 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.936759 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.963203 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zms7v\" (UniqueName: \"kubernetes.io/projected/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-kube-api-access-zms7v\") pod \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\" (UID: \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\") " Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.963327 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-operator-scripts\") pod \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\" (UID: \"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef\") " Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.963802 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qmhct" event={"ID":"a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef","Type":"ContainerDied","Data":"fe3659554d0bf545881f7fb5e60019b6d6d2acd3a92c5ce512ac7e8337493a25"} Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.963844 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe3659554d0bf545881f7fb5e60019b6d6d2acd3a92c5ce512ac7e8337493a25" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.964010 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qmhct" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.966281 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.966456 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef" (UID: "a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.966708 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e4b-account-create-update-z6xkv" event={"ID":"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6","Type":"ContainerDied","Data":"85d36dfedf16c3c10efb3b3f5668ee1b7cd699ed4bb502ffa18fa9727b8c17cc"} Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.966804 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85d36dfedf16c3c10efb3b3f5668ee1b7cd699ed4bb502ffa18fa9727b8c17cc" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.966920 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0e4b-account-create-update-z6xkv" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.975492 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-04bc-account-create-update-5k2d8" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.975678 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-04bc-account-create-update-5k2d8" event={"ID":"e912bd72-a05d-4456-92f8-42a5eca2621b","Type":"ContainerDied","Data":"44a51db49cd5e2b02af006193545cf64ebd9e32b001054be156b34d60de149fe"} Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.975725 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a51db49cd5e2b02af006193545cf64ebd9e32b001054be156b34d60de149fe" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.979203 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-kube-api-access-zms7v" (OuterVolumeSpecName: "kube-api-access-zms7v") pod "a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef" (UID: "a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef"). InnerVolumeSpecName "kube-api-access-zms7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.979640 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7742r" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.979906 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8q49t" event={"ID":"16ebb8de-ce70-4164-8375-945db41f55e4","Type":"ContainerDied","Data":"dc33177cd6fe84bb660f915bd34d7e2f0968da723f3ad63f24af14a9e6d15103"} Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.979957 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc33177cd6fe84bb660f915bd34d7e2f0968da723f3ad63f24af14a9e6d15103" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.980006 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8q49t" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.984406 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d03a-account-create-update-2ttnl" event={"ID":"c681553c-78af-4a9a-bcf1-a03424706e78","Type":"ContainerDied","Data":"cff937b70429181c1fcbad6323c4706c920456e6ed8173f4be49bf0b065c92b6"} Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.984436 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff937b70429181c1fcbad6323c4706c920456e6ed8173f4be49bf0b065c92b6" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.986402 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7742r" event={"ID":"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6","Type":"ContainerDied","Data":"1c9753f79d0e3d1f2202d5a26927fb1e206d7306d9ba31f6a91a7a1a61269cdf"} Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.986427 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c9753f79d0e3d1f2202d5a26927fb1e206d7306d9ba31f6a91a7a1a61269cdf" Nov 26 12:25:56 crc kubenswrapper[4834]: I1126 12:25:56.986591 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7742r" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.020355 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.064792 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-operator-scripts\") pod \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\" (UID: \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.064859 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16ebb8de-ce70-4164-8375-945db41f55e4-operator-scripts\") pod \"16ebb8de-ce70-4164-8375-945db41f55e4\" (UID: \"16ebb8de-ce70-4164-8375-945db41f55e4\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.064984 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sq9t\" (UniqueName: \"kubernetes.io/projected/16ebb8de-ce70-4164-8375-945db41f55e4-kube-api-access-2sq9t\") pod \"16ebb8de-ce70-4164-8375-945db41f55e4\" (UID: \"16ebb8de-ce70-4164-8375-945db41f55e4\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065033 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e912bd72-a05d-4456-92f8-42a5eca2621b-operator-scripts\") pod \"e912bd72-a05d-4456-92f8-42a5eca2621b\" (UID: \"e912bd72-a05d-4456-92f8-42a5eca2621b\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065070 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqpvf\" (UniqueName: \"kubernetes.io/projected/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-kube-api-access-bqpvf\") pod \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\" (UID: \"ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065162 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-operator-scripts\") pod \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\" (UID: \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065195 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-497h9\" (UniqueName: \"kubernetes.io/projected/c681553c-78af-4a9a-bcf1-a03424706e78-kube-api-access-497h9\") pod \"c681553c-78af-4a9a-bcf1-a03424706e78\" (UID: \"c681553c-78af-4a9a-bcf1-a03424706e78\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065214 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6" (UID: "ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065238 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwkb8\" (UniqueName: \"kubernetes.io/projected/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-kube-api-access-rwkb8\") pod \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\" (UID: \"09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065320 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfv8l\" (UniqueName: \"kubernetes.io/projected/e912bd72-a05d-4456-92f8-42a5eca2621b-kube-api-access-vfv8l\") pod \"e912bd72-a05d-4456-92f8-42a5eca2621b\" (UID: \"e912bd72-a05d-4456-92f8-42a5eca2621b\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065363 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c681553c-78af-4a9a-bcf1-a03424706e78-operator-scripts\") pod \"c681553c-78af-4a9a-bcf1-a03424706e78\" (UID: \"c681553c-78af-4a9a-bcf1-a03424706e78\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065481 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e912bd72-a05d-4456-92f8-42a5eca2621b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e912bd72-a05d-4456-92f8-42a5eca2621b" (UID: "e912bd72-a05d-4456-92f8-42a5eca2621b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065946 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065966 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zms7v\" (UniqueName: \"kubernetes.io/projected/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-kube-api-access-zms7v\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065979 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.065989 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e912bd72-a05d-4456-92f8-42a5eca2621b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.066118 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ebb8de-ce70-4164-8375-945db41f55e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16ebb8de-ce70-4164-8375-945db41f55e4" (UID: "16ebb8de-ce70-4164-8375-945db41f55e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.066594 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c681553c-78af-4a9a-bcf1-a03424706e78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c681553c-78af-4a9a-bcf1-a03424706e78" (UID: "c681553c-78af-4a9a-bcf1-a03424706e78"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.066735 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6" (UID: "09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.069646 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ebb8de-ce70-4164-8375-945db41f55e4-kube-api-access-2sq9t" (OuterVolumeSpecName: "kube-api-access-2sq9t") pod "16ebb8de-ce70-4164-8375-945db41f55e4" (UID: "16ebb8de-ce70-4164-8375-945db41f55e4"). InnerVolumeSpecName "kube-api-access-2sq9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.069720 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-kube-api-access-bqpvf" (OuterVolumeSpecName: "kube-api-access-bqpvf") pod "ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6" (UID: "ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6"). InnerVolumeSpecName "kube-api-access-bqpvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.069831 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c681553c-78af-4a9a-bcf1-a03424706e78-kube-api-access-497h9" (OuterVolumeSpecName: "kube-api-access-497h9") pod "c681553c-78af-4a9a-bcf1-a03424706e78" (UID: "c681553c-78af-4a9a-bcf1-a03424706e78"). InnerVolumeSpecName "kube-api-access-497h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.069970 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e912bd72-a05d-4456-92f8-42a5eca2621b-kube-api-access-vfv8l" (OuterVolumeSpecName: "kube-api-access-vfv8l") pod "e912bd72-a05d-4456-92f8-42a5eca2621b" (UID: "e912bd72-a05d-4456-92f8-42a5eca2621b"). InnerVolumeSpecName "kube-api-access-vfv8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.075219 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-kube-api-access-rwkb8" (OuterVolumeSpecName: "kube-api-access-rwkb8") pod "09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6" (UID: "09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6"). InnerVolumeSpecName "kube-api-access-rwkb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.167181 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfv8l\" (UniqueName: \"kubernetes.io/projected/e912bd72-a05d-4456-92f8-42a5eca2621b-kube-api-access-vfv8l\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.167336 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c681553c-78af-4a9a-bcf1-a03424706e78-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.167407 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16ebb8de-ce70-4164-8375-945db41f55e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.167476 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sq9t\" (UniqueName: 
\"kubernetes.io/projected/16ebb8de-ce70-4164-8375-945db41f55e4-kube-api-access-2sq9t\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.167528 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqpvf\" (UniqueName: \"kubernetes.io/projected/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6-kube-api-access-bqpvf\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.167577 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.167632 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-497h9\" (UniqueName: \"kubernetes.io/projected/c681553c-78af-4a9a-bcf1-a03424706e78-kube-api-access-497h9\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.167687 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwkb8\" (UniqueName: \"kubernetes.io/projected/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6-kube-api-access-rwkb8\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.468793 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.521103 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-nxj5r"] Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.521406 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" podUID="2fcb2f84-372f-4136-a20b-79a4388eda80" containerName="dnsmasq-dns" containerID="cri-o://bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15" gracePeriod=10 Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 
12:25:57.926568 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.989303 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcb4m\" (UniqueName: \"kubernetes.io/projected/2fcb2f84-372f-4136-a20b-79a4388eda80-kube-api-access-jcb4m\") pod \"2fcb2f84-372f-4136-a20b-79a4388eda80\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.989394 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-config\") pod \"2fcb2f84-372f-4136-a20b-79a4388eda80\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.989455 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-dns-svc\") pod \"2fcb2f84-372f-4136-a20b-79a4388eda80\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.989502 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-nb\") pod \"2fcb2f84-372f-4136-a20b-79a4388eda80\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " Nov 26 12:25:57 crc kubenswrapper[4834]: I1126 12:25:57.989548 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-sb\") pod \"2fcb2f84-372f-4136-a20b-79a4388eda80\" (UID: \"2fcb2f84-372f-4136-a20b-79a4388eda80\") " Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.017256 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fcb2f84-372f-4136-a20b-79a4388eda80-kube-api-access-jcb4m" (OuterVolumeSpecName: "kube-api-access-jcb4m") pod "2fcb2f84-372f-4136-a20b-79a4388eda80" (UID: "2fcb2f84-372f-4136-a20b-79a4388eda80"). InnerVolumeSpecName "kube-api-access-jcb4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.025621 4834 generic.go:334] "Generic (PLEG): container finished" podID="2fcb2f84-372f-4136-a20b-79a4388eda80" containerID="bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15" exitCode=0 Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.025682 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" event={"ID":"2fcb2f84-372f-4136-a20b-79a4388eda80","Type":"ContainerDied","Data":"bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15"} Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.025710 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" event={"ID":"2fcb2f84-372f-4136-a20b-79a4388eda80","Type":"ContainerDied","Data":"0faf434d91f53c1660db6e91d6d525772de6811053c9c72985aea33ebb6841ad"} Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.025726 4834 scope.go:117] "RemoveContainer" containerID="bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.025830 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6b5695-nxj5r" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.027165 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2fcb2f84-372f-4136-a20b-79a4388eda80" (UID: "2fcb2f84-372f-4136-a20b-79a4388eda80"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.029443 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d03a-account-create-update-2ttnl" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.029543 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9p5t2" event={"ID":"1a0e6cda-4565-4342-b83f-39df9cdd4207","Type":"ContainerStarted","Data":"02bc616cb82e30421c00aa45067e6c0f25fc87c2fbf7ff8078601aae94379266"} Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.048430 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-config" (OuterVolumeSpecName: "config") pod "2fcb2f84-372f-4136-a20b-79a4388eda80" (UID: "2fcb2f84-372f-4136-a20b-79a4388eda80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.056129 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-9p5t2" podStartSLOduration=2.361082776 podStartE2EDuration="6.056084294s" podCreationTimestamp="2025-11-26 12:25:52 +0000 UTC" firstStartedPulling="2025-11-26 12:25:53.037065329 +0000 UTC m=+850.944278681" lastFinishedPulling="2025-11-26 12:25:56.732066847 +0000 UTC m=+854.639280199" observedRunningTime="2025-11-26 12:25:58.041828236 +0000 UTC m=+855.949041587" watchObservedRunningTime="2025-11-26 12:25:58.056084294 +0000 UTC m=+855.963297636" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.057963 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2fcb2f84-372f-4136-a20b-79a4388eda80" (UID: "2fcb2f84-372f-4136-a20b-79a4388eda80"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.061551 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2fcb2f84-372f-4136-a20b-79a4388eda80" (UID: "2fcb2f84-372f-4136-a20b-79a4388eda80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.068071 4834 scope.go:117] "RemoveContainer" containerID="16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.092140 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.092168 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.092179 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcb4m\" (UniqueName: \"kubernetes.io/projected/2fcb2f84-372f-4136-a20b-79a4388eda80-kube-api-access-jcb4m\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.092189 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.092198 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fcb2f84-372f-4136-a20b-79a4388eda80-dns-svc\") on node 
\"crc\" DevicePath \"\"" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.096919 4834 scope.go:117] "RemoveContainer" containerID="bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15" Nov 26 12:25:58 crc kubenswrapper[4834]: E1126 12:25:58.097751 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15\": container with ID starting with bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15 not found: ID does not exist" containerID="bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.097797 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15"} err="failed to get container status \"bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15\": rpc error: code = NotFound desc = could not find container \"bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15\": container with ID starting with bcf54bc27b24c3d6172ca30da2a8418fd2ac7ad7d294489c52b8a2e0b5f69b15 not found: ID does not exist" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.097826 4834 scope.go:117] "RemoveContainer" containerID="16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064" Nov 26 12:25:58 crc kubenswrapper[4834]: E1126 12:25:58.098218 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064\": container with ID starting with 16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064 not found: ID does not exist" containerID="16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.098254 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064"} err="failed to get container status \"16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064\": rpc error: code = NotFound desc = could not find container \"16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064\": container with ID starting with 16d7ed8c82d138f983444d2b3f5b5e9739ec907540badd810d7c4afa8d8cd064 not found: ID does not exist" Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.353263 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-nxj5r"] Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.357643 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-nxj5r"] Nov 26 12:25:58 crc kubenswrapper[4834]: I1126 12:25:58.426548 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fcb2f84-372f-4136-a20b-79a4388eda80" path="/var/lib/kubelet/pods/2fcb2f84-372f-4136-a20b-79a4388eda80/volumes" Nov 26 12:25:59 crc kubenswrapper[4834]: I1126 12:25:59.037765 4834 generic.go:334] "Generic (PLEG): container finished" podID="1a0e6cda-4565-4342-b83f-39df9cdd4207" containerID="02bc616cb82e30421c00aa45067e6c0f25fc87c2fbf7ff8078601aae94379266" exitCode=0 Nov 26 12:25:59 crc kubenswrapper[4834]: I1126 12:25:59.037845 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9p5t2" event={"ID":"1a0e6cda-4565-4342-b83f-39df9cdd4207","Type":"ContainerDied","Data":"02bc616cb82e30421c00aa45067e6c0f25fc87c2fbf7ff8078601aae94379266"} Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.287546 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.431582 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzvpq\" (UniqueName: \"kubernetes.io/projected/1a0e6cda-4565-4342-b83f-39df9cdd4207-kube-api-access-zzvpq\") pod \"1a0e6cda-4565-4342-b83f-39df9cdd4207\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.431660 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-config-data\") pod \"1a0e6cda-4565-4342-b83f-39df9cdd4207\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.431686 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-combined-ca-bundle\") pod \"1a0e6cda-4565-4342-b83f-39df9cdd4207\" (UID: \"1a0e6cda-4565-4342-b83f-39df9cdd4207\") " Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.436808 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0e6cda-4565-4342-b83f-39df9cdd4207-kube-api-access-zzvpq" (OuterVolumeSpecName: "kube-api-access-zzvpq") pod "1a0e6cda-4565-4342-b83f-39df9cdd4207" (UID: "1a0e6cda-4565-4342-b83f-39df9cdd4207"). InnerVolumeSpecName "kube-api-access-zzvpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.454020 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a0e6cda-4565-4342-b83f-39df9cdd4207" (UID: "1a0e6cda-4565-4342-b83f-39df9cdd4207"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.464159 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-config-data" (OuterVolumeSpecName: "config-data") pod "1a0e6cda-4565-4342-b83f-39df9cdd4207" (UID: "1a0e6cda-4565-4342-b83f-39df9cdd4207"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.534173 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzvpq\" (UniqueName: \"kubernetes.io/projected/1a0e6cda-4565-4342-b83f-39df9cdd4207-kube-api-access-zzvpq\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.534208 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:00 crc kubenswrapper[4834]: I1126 12:26:00.534218 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0e6cda-4565-4342-b83f-39df9cdd4207-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.050162 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9p5t2" event={"ID":"1a0e6cda-4565-4342-b83f-39df9cdd4207","Type":"ContainerDied","Data":"91e6af03f101a770915731e80b7445e97001579ca2eb03413247fa00e26de4e6"} Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.050201 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91e6af03f101a770915731e80b7445e97001579ca2eb03413247fa00e26de4e6" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.050206 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9p5t2" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266160 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d44dbddd5-7lq7z"] Nov 26 12:26:01 crc kubenswrapper[4834]: E1126 12:26:01.266462 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6" containerName="mariadb-database-create" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266480 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6" containerName="mariadb-database-create" Nov 26 12:26:01 crc kubenswrapper[4834]: E1126 12:26:01.266499 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c681553c-78af-4a9a-bcf1-a03424706e78" containerName="mariadb-account-create-update" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266505 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c681553c-78af-4a9a-bcf1-a03424706e78" containerName="mariadb-account-create-update" Nov 26 12:26:01 crc kubenswrapper[4834]: E1126 12:26:01.266516 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef" containerName="mariadb-database-create" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266522 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef" containerName="mariadb-database-create" Nov 26 12:26:01 crc kubenswrapper[4834]: E1126 12:26:01.266531 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ebb8de-ce70-4164-8375-945db41f55e4" containerName="mariadb-database-create" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266536 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ebb8de-ce70-4164-8375-945db41f55e4" containerName="mariadb-database-create" Nov 26 12:26:01 crc kubenswrapper[4834]: E1126 12:26:01.266543 4834 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1a0e6cda-4565-4342-b83f-39df9cdd4207" containerName="keystone-db-sync" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266549 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0e6cda-4565-4342-b83f-39df9cdd4207" containerName="keystone-db-sync" Nov 26 12:26:01 crc kubenswrapper[4834]: E1126 12:26:01.266557 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6" containerName="mariadb-account-create-update" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266563 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6" containerName="mariadb-account-create-update" Nov 26 12:26:01 crc kubenswrapper[4834]: E1126 12:26:01.266571 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e912bd72-a05d-4456-92f8-42a5eca2621b" containerName="mariadb-account-create-update" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266577 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e912bd72-a05d-4456-92f8-42a5eca2621b" containerName="mariadb-account-create-update" Nov 26 12:26:01 crc kubenswrapper[4834]: E1126 12:26:01.266586 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fcb2f84-372f-4136-a20b-79a4388eda80" containerName="dnsmasq-dns" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266593 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fcb2f84-372f-4136-a20b-79a4388eda80" containerName="dnsmasq-dns" Nov 26 12:26:01 crc kubenswrapper[4834]: E1126 12:26:01.266603 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fcb2f84-372f-4136-a20b-79a4388eda80" containerName="init" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266608 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fcb2f84-372f-4136-a20b-79a4388eda80" containerName="init" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266727 4834 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6" containerName="mariadb-database-create" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266738 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0e6cda-4565-4342-b83f-39df9cdd4207" containerName="keystone-db-sync" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266750 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e912bd72-a05d-4456-92f8-42a5eca2621b" containerName="mariadb-account-create-update" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266760 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fcb2f84-372f-4136-a20b-79a4388eda80" containerName="dnsmasq-dns" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266769 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ebb8de-ce70-4164-8375-945db41f55e4" containerName="mariadb-database-create" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266776 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c681553c-78af-4a9a-bcf1-a03424706e78" containerName="mariadb-account-create-update" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266785 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef" containerName="mariadb-database-create" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.266792 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6" containerName="mariadb-account-create-update" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.267491 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.286150 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d44dbddd5-7lq7z"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.326391 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bx29h"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.327486 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.332896 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.333040 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.333331 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.334417 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.337925 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8frzm" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.345190 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-sb\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.345226 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxbxd\" (UniqueName: 
\"kubernetes.io/projected/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-kube-api-access-bxbxd\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.345255 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-config\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.345284 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-nb\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.345455 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-dns-svc\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.353445 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bx29h"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452575 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6f6c\" (UniqueName: \"kubernetes.io/projected/ae3faaec-add1-43f6-95ad-7cc26517afc3-kube-api-access-d6f6c\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 
12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452630 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-fernet-keys\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452675 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-sb\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452699 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxbxd\" (UniqueName: \"kubernetes.io/projected/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-kube-api-access-bxbxd\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452726 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-scripts\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452745 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-config\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452775 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-combined-ca-bundle\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452795 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-nb\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452820 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-credential-keys\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452846 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-config-data\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.452874 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-dns-svc\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.453819 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-dns-svc\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.454012 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-sb\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.454373 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-config\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.454833 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-nb\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.498115 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxbxd\" (UniqueName: \"kubernetes.io/projected/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-kube-api-access-bxbxd\") pod \"dnsmasq-dns-5d44dbddd5-7lq7z\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.515504 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86d6455c5c-px2dk"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 
12:26:01.516725 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.520179 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.521350 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.523935 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-plsc2" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.527601 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.535595 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86d6455c5c-px2dk"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.554047 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6f6c\" (UniqueName: \"kubernetes.io/projected/ae3faaec-add1-43f6-95ad-7cc26517afc3-kube-api-access-d6f6c\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.554107 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-fernet-keys\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.554154 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-scripts\") pod \"keystone-bootstrap-bx29h\" (UID: 
\"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.554182 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-combined-ca-bundle\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.554207 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-credential-keys\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.554227 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-config-data\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.569703 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-combined-ca-bundle\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.570042 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-fernet-keys\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc 
kubenswrapper[4834]: I1126 12:26:01.572879 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-scripts\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.581642 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.584123 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-credential-keys\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.585879 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-config-data\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.607104 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6f6c\" (UniqueName: \"kubernetes.io/projected/ae3faaec-add1-43f6-95ad-7cc26517afc3-kube-api-access-d6f6c\") pod \"keystone-bootstrap-bx29h\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.644937 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.646809 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.649624 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.654134 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.654377 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.655211 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d21dfc7-f906-45e6-99f3-bac942be8ed9-logs\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.655332 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-scripts\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.655465 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mctt\" (UniqueName: \"kubernetes.io/projected/8d21dfc7-f906-45e6-99f3-bac942be8ed9-kube-api-access-5mctt\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.655541 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/8d21dfc7-f906-45e6-99f3-bac942be8ed9-horizon-secret-key\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.655756 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-config-data\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.673689 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.685352 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d44dbddd5-7lq7z"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.705118 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-rjgqd"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.706430 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.709654 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gxrts" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.709859 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.709871 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.725389 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rjgqd"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.751969 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8df9b6575-h4v75"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.753330 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760154 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-config-data\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760364 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d21dfc7-f906-45e6-99f3-bac942be8ed9-logs\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760450 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-log-httpd\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760529 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760605 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhrzb\" (UniqueName: \"kubernetes.io/projected/dfde693e-68da-45d5-803a-c8a0e36a198a-kube-api-access-xhrzb\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760670 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-scripts\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760758 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-run-httpd\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760828 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mctt\" (UniqueName: \"kubernetes.io/projected/8d21dfc7-f906-45e6-99f3-bac942be8ed9-kube-api-access-5mctt\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760898 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d21dfc7-f906-45e6-99f3-bac942be8ed9-horizon-secret-key\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.760962 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-scripts\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.761032 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.761329 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-config-data\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.762524 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-config-data\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.762794 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d21dfc7-f906-45e6-99f3-bac942be8ed9-logs\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.763391 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-scripts\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.766115 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d21dfc7-f906-45e6-99f3-bac942be8ed9-horizon-secret-key\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " 
pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.798656 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f8f5cc67-57p2t"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.800294 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.804795 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mctt\" (UniqueName: \"kubernetes.io/projected/8d21dfc7-f906-45e6-99f3-bac942be8ed9-kube-api-access-5mctt\") pod \"horizon-86d6455c5c-px2dk\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.826030 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8df9b6575-h4v75"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.845498 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.854709 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f8f5cc67-57p2t"] Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.862942 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-sb\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.862980 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-combined-ca-bundle\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.862998 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km94j\" (UniqueName: \"kubernetes.io/projected/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-kube-api-access-km94j\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863013 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njdl4\" (UniqueName: \"kubernetes.io/projected/d96f008c-e967-4142-a433-92edb4634097-kube-api-access-njdl4\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863096 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-config-data\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863116 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c49a442e-414a-4d5e-b45d-15e57570625e-logs\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863137 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-config-data\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863183 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-log-httpd\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863199 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-config-data\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863216 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863236 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-nb\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863253 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhrzb\" (UniqueName: \"kubernetes.io/projected/dfde693e-68da-45d5-803a-c8a0e36a198a-kube-api-access-xhrzb\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863273 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-scripts\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863294 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-scripts\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863333 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-dns-svc\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " 
pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863373 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-run-httpd\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863397 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c49a442e-414a-4d5e-b45d-15e57570625e-horizon-secret-key\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863419 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-scripts\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863447 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863467 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96f008c-e967-4142-a433-92edb4634097-logs\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863484 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb8w8\" (UniqueName: \"kubernetes.io/projected/c49a442e-414a-4d5e-b45d-15e57570625e-kube-api-access-wb8w8\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.863501 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-config\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.871791 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-config-data\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.872457 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-run-httpd\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.873140 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-log-httpd\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.885683 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.891610 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-scripts\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.892390 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.900045 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhrzb\" (UniqueName: \"kubernetes.io/projected/dfde693e-68da-45d5-803a-c8a0e36a198a-kube-api-access-xhrzb\") pod \"ceilometer-0\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.965730 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-nb\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.965794 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-scripts\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.965830 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-scripts\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.965861 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-dns-svc\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.965911 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c49a442e-414a-4d5e-b45d-15e57570625e-horizon-secret-key\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.965976 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96f008c-e967-4142-a433-92edb4634097-logs\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.965998 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb8w8\" (UniqueName: \"kubernetes.io/projected/c49a442e-414a-4d5e-b45d-15e57570625e-kube-api-access-wb8w8\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.966020 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-config\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.966083 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-sb\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.966122 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-combined-ca-bundle\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.966139 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km94j\" (UniqueName: \"kubernetes.io/projected/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-kube-api-access-km94j\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.966160 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njdl4\" (UniqueName: \"kubernetes.io/projected/d96f008c-e967-4142-a433-92edb4634097-kube-api-access-njdl4\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.966211 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c49a442e-414a-4d5e-b45d-15e57570625e-logs\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.966232 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-config-data\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.967579 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-config-data\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.968966 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-config\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.968994 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-nb\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.969519 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-sb\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: 
\"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.969733 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-scripts\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.971291 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c49a442e-414a-4d5e-b45d-15e57570625e-logs\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.972033 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96f008c-e967-4142-a433-92edb4634097-logs\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.972543 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-config-data\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.973147 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-config-data\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.976393 4834 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c49a442e-414a-4d5e-b45d-15e57570625e-horizon-secret-key\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.976688 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-dns-svc\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.977633 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-scripts\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.978056 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-combined-ca-bundle\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.985132 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km94j\" (UniqueName: \"kubernetes.io/projected/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-kube-api-access-km94j\") pod \"dnsmasq-dns-7f8f5cc67-57p2t\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.986238 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.991273 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb8w8\" (UniqueName: \"kubernetes.io/projected/c49a442e-414a-4d5e-b45d-15e57570625e-kube-api-access-wb8w8\") pod \"horizon-8df9b6575-h4v75\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:01 crc kubenswrapper[4834]: I1126 12:26:01.991287 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njdl4\" (UniqueName: \"kubernetes.io/projected/d96f008c-e967-4142-a433-92edb4634097-kube-api-access-njdl4\") pod \"placement-db-sync-rjgqd\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.093055 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.128596 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.136947 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.240257 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d44dbddd5-7lq7z"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.297876 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bx29h"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.334989 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6c9ml"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.336758 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.341106 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6p9qv" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.341436 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.358941 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6c9ml"] Nov 26 12:26:02 crc kubenswrapper[4834]: W1126 12:26:02.366735 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d21dfc7_f906_45e6_99f3_bac942be8ed9.slice/crio-f7b60804d3afcbafda6dc7d27f983da354074942aa94c41490d7e7bdb380725b WatchSource:0}: Error finding container f7b60804d3afcbafda6dc7d27f983da354074942aa94c41490d7e7bdb380725b: Status 404 returned error can't find the container with id f7b60804d3afcbafda6dc7d27f983da354074942aa94c41490d7e7bdb380725b Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.376897 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86d6455c5c-px2dk"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.470834 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.487767 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-db-sync-config-data\") pod \"barbican-db-sync-6c9ml\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.487821 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-combined-ca-bundle\") pod \"barbican-db-sync-6c9ml\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.487965 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244dd\" (UniqueName: \"kubernetes.io/projected/2eac617a-c5a8-44c6-b790-55ec23e59e5a-kube-api-access-244dd\") pod \"barbican-db-sync-6c9ml\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.490756 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-kj94c"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.494484 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.500230 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.500527 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x8jns" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.500751 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.537284 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-kj94c"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.563098 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7lbgp"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.564440 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.566250 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.566641 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xlj8v" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.566942 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.569978 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7lbgp"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.590548 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-scripts\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.590584 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ddc5896-100d-473a-9bed-a2e13560bc8e-etc-machine-id\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.590614 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-244dd\" (UniqueName: \"kubernetes.io/projected/2eac617a-c5a8-44c6-b790-55ec23e59e5a-kube-api-access-244dd\") pod \"barbican-db-sync-6c9ml\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.590659 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-combined-ca-bundle\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.590675 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-db-sync-config-data\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.590713 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-config-data\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.590730 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn8sn\" (UniqueName: \"kubernetes.io/projected/0ddc5896-100d-473a-9bed-a2e13560bc8e-kube-api-access-gn8sn\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.590762 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-db-sync-config-data\") pod \"barbican-db-sync-6c9ml\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.590780 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-combined-ca-bundle\") pod \"barbican-db-sync-6c9ml\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.595569 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-db-sync-config-data\") pod \"barbican-db-sync-6c9ml\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.596018 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rjgqd"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.596093 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-combined-ca-bundle\") pod \"barbican-db-sync-6c9ml\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.605703 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-244dd\" (UniqueName: \"kubernetes.io/projected/2eac617a-c5a8-44c6-b790-55ec23e59e5a-kube-api-access-244dd\") pod \"barbican-db-sync-6c9ml\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.666633 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.677087 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8df9b6575-h4v75"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.691916 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-config-data\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.692040 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn8sn\" (UniqueName: \"kubernetes.io/projected/0ddc5896-100d-473a-9bed-a2e13560bc8e-kube-api-access-gn8sn\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.692237 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-scripts\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.692320 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ddc5896-100d-473a-9bed-a2e13560bc8e-etc-machine-id\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.693341 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-combined-ca-bundle\") pod 
\"neutron-db-sync-7lbgp\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.693480 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkm6h\" (UniqueName: \"kubernetes.io/projected/9f95bd75-d740-47da-9ff2-d13cd8914aa4-kube-api-access-hkm6h\") pod \"neutron-db-sync-7lbgp\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.693612 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-combined-ca-bundle\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.693680 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-db-sync-config-data\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.693774 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-config\") pod \"neutron-db-sync-7lbgp\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.693928 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ddc5896-100d-473a-9bed-a2e13560bc8e-etc-machine-id\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " 
pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.694109 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f8f5cc67-57p2t"] Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.696980 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-config-data\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.697372 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-scripts\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.697894 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-combined-ca-bundle\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: W1126 12:26:02.697970 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc49a442e_414a_4d5e_b45d_15e57570625e.slice/crio-c8b9577ed2b6f0d7e93ee87e42d3db55016f6adb61dea0785ee6ec13b8f0f8ed WatchSource:0}: Error finding container c8b9577ed2b6f0d7e93ee87e42d3db55016f6adb61dea0785ee6ec13b8f0f8ed: Status 404 returned error can't find the container with id c8b9577ed2b6f0d7e93ee87e42d3db55016f6adb61dea0785ee6ec13b8f0f8ed Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.698595 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-db-sync-config-data\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.709281 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn8sn\" (UniqueName: \"kubernetes.io/projected/0ddc5896-100d-473a-9bed-a2e13560bc8e-kube-api-access-gn8sn\") pod \"cinder-db-sync-kj94c\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.795084 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-combined-ca-bundle\") pod \"neutron-db-sync-7lbgp\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.795336 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkm6h\" (UniqueName: \"kubernetes.io/projected/9f95bd75-d740-47da-9ff2-d13cd8914aa4-kube-api-access-hkm6h\") pod \"neutron-db-sync-7lbgp\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.795453 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-config\") pod \"neutron-db-sync-7lbgp\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.803105 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-config\") pod \"neutron-db-sync-7lbgp\" (UID: 
\"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.804851 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-combined-ca-bundle\") pod \"neutron-db-sync-7lbgp\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.813165 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkm6h\" (UniqueName: \"kubernetes.io/projected/9f95bd75-d740-47da-9ff2-d13cd8914aa4-kube-api-access-hkm6h\") pod \"neutron-db-sync-7lbgp\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:02 crc kubenswrapper[4834]: I1126 12:26:02.984146 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.097555 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6c9ml"] Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.097607 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" event={"ID":"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b","Type":"ContainerDied","Data":"a111b98cccc63f8a7523f4bf1dd556be462668000e2c44c0e796cc3e5f80eb4e"} Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.097545 4834 generic.go:334] "Generic (PLEG): container finished" podID="f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" containerID="a111b98cccc63f8a7523f4bf1dd556be462668000e2c44c0e796cc3e5f80eb4e" exitCode=0 Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.097700 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" 
event={"ID":"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b","Type":"ContainerStarted","Data":"69bcdf971c3b10ee27098c1205da3992ddb5a295ee44c62273e80dda7e8bb519"} Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.106452 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rjgqd" event={"ID":"d96f008c-e967-4142-a433-92edb4634097","Type":"ContainerStarted","Data":"bd571e181e8eace8adef53449912ca16599fe771c679286c895383dc4b4e82b5"} Nov 26 12:26:03 crc kubenswrapper[4834]: W1126 12:26:03.106797 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eac617a_c5a8_44c6_b790_55ec23e59e5a.slice/crio-916cb4c1cfdbffa539a6f6f1e102fd18c97b455e097dcd7f37df260b35f36ced WatchSource:0}: Error finding container 916cb4c1cfdbffa539a6f6f1e102fd18c97b455e097dcd7f37df260b35f36ced: Status 404 returned error can't find the container with id 916cb4c1cfdbffa539a6f6f1e102fd18c97b455e097dcd7f37df260b35f36ced Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.110591 4834 generic.go:334] "Generic (PLEG): container finished" podID="b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" containerID="0f18ce8820e9af19d20945fe81a77333049388fc5955b37b76ad04b74d2f5d29" exitCode=0 Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.110636 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" event={"ID":"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5","Type":"ContainerDied","Data":"0f18ce8820e9af19d20945fe81a77333049388fc5955b37b76ad04b74d2f5d29"} Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.110653 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" event={"ID":"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5","Type":"ContainerStarted","Data":"0a4c3befced02872f064a52b951fd1acf9c163e214585503f364a039654c9386"} Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.114479 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.117066 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfde693e-68da-45d5-803a-c8a0e36a198a","Type":"ContainerStarted","Data":"ccb4280319de17ed85411472b25e27102b342c359fa2bedf9781b5c0f67ac1c7"} Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.133209 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bx29h" event={"ID":"ae3faaec-add1-43f6-95ad-7cc26517afc3","Type":"ContainerStarted","Data":"71ad9e10c017b08a6c8991a3f5e95d29784904a18ed6e5fd1eee7adf71954d08"} Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.133249 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bx29h" event={"ID":"ae3faaec-add1-43f6-95ad-7cc26517afc3","Type":"ContainerStarted","Data":"919018a8dc28ae39fa6678cf094feed3e05b30bb84a5870997b722f1b05c97e2"} Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.135744 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8df9b6575-h4v75" event={"ID":"c49a442e-414a-4d5e-b45d-15e57570625e","Type":"ContainerStarted","Data":"c8b9577ed2b6f0d7e93ee87e42d3db55016f6adb61dea0785ee6ec13b8f0f8ed"} Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.137302 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86d6455c5c-px2dk" event={"ID":"8d21dfc7-f906-45e6-99f3-bac942be8ed9","Type":"ContainerStarted","Data":"f7b60804d3afcbafda6dc7d27f983da354074942aa94c41490d7e7bdb380725b"} Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.168509 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bx29h" podStartSLOduration=2.168490372 podStartE2EDuration="2.168490372s" podCreationTimestamp="2025-11-26 12:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-11-26 12:26:03.157711363 +0000 UTC m=+861.064924715" watchObservedRunningTime="2025-11-26 12:26:03.168490372 +0000 UTC m=+861.075703724" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.396672 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86d6455c5c-px2dk"] Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.433505 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b64d5bd45-4kjkh"] Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.434830 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.450733 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-kj94c"] Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.513837 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b64d5bd45-4kjkh"] Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.528613 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.531254 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23fd02e3-eaab-4e25-b9f7-74bee38551e7-horizon-secret-key\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.531348 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fd02e3-eaab-4e25-b9f7-74bee38551e7-logs\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.531393 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-scripts\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.531427 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-config-data\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.531750 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nts6\" (UniqueName: \"kubernetes.io/projected/23fd02e3-eaab-4e25-b9f7-74bee38551e7-kube-api-access-2nts6\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.603862 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.633342 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nts6\" (UniqueName: \"kubernetes.io/projected/23fd02e3-eaab-4e25-b9f7-74bee38551e7-kube-api-access-2nts6\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.633454 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23fd02e3-eaab-4e25-b9f7-74bee38551e7-horizon-secret-key\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.633524 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fd02e3-eaab-4e25-b9f7-74bee38551e7-logs\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.633555 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-scripts\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.633596 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-config-data\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 
12:26:03.633977 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fd02e3-eaab-4e25-b9f7-74bee38551e7-logs\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.634259 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-scripts\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.634977 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-config-data\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.644036 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7lbgp"] Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.647688 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23fd02e3-eaab-4e25-b9f7-74bee38551e7-horizon-secret-key\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.647711 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nts6\" (UniqueName: \"kubernetes.io/projected/23fd02e3-eaab-4e25-b9f7-74bee38551e7-kube-api-access-2nts6\") pod \"horizon-b64d5bd45-4kjkh\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:03 crc kubenswrapper[4834]: W1126 
12:26:03.651066 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f95bd75_d740_47da_9ff2_d13cd8914aa4.slice/crio-0b27da36893db5e5036fb1ca9a136a379aeed43a94ffc3414abb39dd27513e5c WatchSource:0}: Error finding container 0b27da36893db5e5036fb1ca9a136a379aeed43a94ffc3414abb39dd27513e5c: Status 404 returned error can't find the container with id 0b27da36893db5e5036fb1ca9a136a379aeed43a94ffc3414abb39dd27513e5c Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.734813 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-config\") pod \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.735125 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-dns-svc\") pod \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.735179 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-sb\") pod \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.735259 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-nb\") pod \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.735391 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bxbxd\" (UniqueName: \"kubernetes.io/projected/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-kube-api-access-bxbxd\") pod \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\" (UID: \"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b\") " Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.741815 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-kube-api-access-bxbxd" (OuterVolumeSpecName: "kube-api-access-bxbxd") pod "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" (UID: "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b"). InnerVolumeSpecName "kube-api-access-bxbxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.755696 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-config" (OuterVolumeSpecName: "config") pod "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" (UID: "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.755704 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" (UID: "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.756661 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" (UID: "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.761747 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" (UID: "f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.838378 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxbxd\" (UniqueName: \"kubernetes.io/projected/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-kube-api-access-bxbxd\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.838419 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.838428 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.838439 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.838448 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:03 crc kubenswrapper[4834]: I1126 12:26:03.897206 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.147398 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" event={"ID":"f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b","Type":"ContainerDied","Data":"69bcdf971c3b10ee27098c1205da3992ddb5a295ee44c62273e80dda7e8bb519"} Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.147694 4834 scope.go:117] "RemoveContainer" containerID="a111b98cccc63f8a7523f4bf1dd556be462668000e2c44c0e796cc3e5f80eb4e" Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.147445 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d44dbddd5-7lq7z" Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.154177 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" event={"ID":"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5","Type":"ContainerStarted","Data":"bef68840b906121a64833a59cb43e0eefbec5517babeed0093eb1b83d87e1219"} Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.154332 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.155571 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kj94c" event={"ID":"0ddc5896-100d-473a-9bed-a2e13560bc8e","Type":"ContainerStarted","Data":"1a536e397ec042d75db0e603a6432f8790589fbffd70e293d7157f7ad85826d2"} Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.157454 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7lbgp" event={"ID":"9f95bd75-d740-47da-9ff2-d13cd8914aa4","Type":"ContainerStarted","Data":"36dc29a3b278be0dc496b3e00caf0131be7692570972dc4ef80807eb90cf6b8b"} Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.157484 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-7lbgp" event={"ID":"9f95bd75-d740-47da-9ff2-d13cd8914aa4","Type":"ContainerStarted","Data":"0b27da36893db5e5036fb1ca9a136a379aeed43a94ffc3414abb39dd27513e5c"} Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.159148 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6c9ml" event={"ID":"2eac617a-c5a8-44c6-b790-55ec23e59e5a","Type":"ContainerStarted","Data":"916cb4c1cfdbffa539a6f6f1e102fd18c97b455e097dcd7f37df260b35f36ced"} Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.175461 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" podStartSLOduration=3.175448077 podStartE2EDuration="3.175448077s" podCreationTimestamp="2025-11-26 12:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:04.169753664 +0000 UTC m=+862.076967016" watchObservedRunningTime="2025-11-26 12:26:04.175448077 +0000 UTC m=+862.082661429" Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.196018 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7lbgp" podStartSLOduration=2.196005203 podStartE2EDuration="2.196005203s" podCreationTimestamp="2025-11-26 12:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:04.187233643 +0000 UTC m=+862.094446995" watchObservedRunningTime="2025-11-26 12:26:04.196005203 +0000 UTC m=+862.103218554" Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.224838 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d44dbddd5-7lq7z"] Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.234828 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d44dbddd5-7lq7z"] Nov 26 12:26:04 crc kubenswrapper[4834]: 
I1126 12:26:04.372408 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b64d5bd45-4kjkh"] Nov 26 12:26:04 crc kubenswrapper[4834]: W1126 12:26:04.374660 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23fd02e3_eaab_4e25_b9f7_74bee38551e7.slice/crio-d10f3a858c8af537e7a547adc7d536435f17a543ea71254c55f13f3b334d845c WatchSource:0}: Error finding container d10f3a858c8af537e7a547adc7d536435f17a543ea71254c55f13f3b334d845c: Status 404 returned error can't find the container with id d10f3a858c8af537e7a547adc7d536435f17a543ea71254c55f13f3b334d845c Nov 26 12:26:04 crc kubenswrapper[4834]: I1126 12:26:04.426449 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" path="/var/lib/kubelet/pods/f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b/volumes" Nov 26 12:26:05 crc kubenswrapper[4834]: I1126 12:26:05.170442 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b64d5bd45-4kjkh" event={"ID":"23fd02e3-eaab-4e25-b9f7-74bee38551e7","Type":"ContainerStarted","Data":"d10f3a858c8af537e7a547adc7d536435f17a543ea71254c55f13f3b334d845c"} Nov 26 12:26:06 crc kubenswrapper[4834]: I1126 12:26:06.185624 4834 generic.go:334] "Generic (PLEG): container finished" podID="ae3faaec-add1-43f6-95ad-7cc26517afc3" containerID="71ad9e10c017b08a6c8991a3f5e95d29784904a18ed6e5fd1eee7adf71954d08" exitCode=0 Nov 26 12:26:06 crc kubenswrapper[4834]: I1126 12:26:06.185670 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bx29h" event={"ID":"ae3faaec-add1-43f6-95ad-7cc26517afc3","Type":"ContainerDied","Data":"71ad9e10c017b08a6c8991a3f5e95d29784904a18ed6e5fd1eee7adf71954d08"} Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.317535 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.465966 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-config-data\") pod \"ae3faaec-add1-43f6-95ad-7cc26517afc3\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.466167 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-credential-keys\") pod \"ae3faaec-add1-43f6-95ad-7cc26517afc3\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.466983 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-fernet-keys\") pod \"ae3faaec-add1-43f6-95ad-7cc26517afc3\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.467072 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-scripts\") pod \"ae3faaec-add1-43f6-95ad-7cc26517afc3\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.467096 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6f6c\" (UniqueName: \"kubernetes.io/projected/ae3faaec-add1-43f6-95ad-7cc26517afc3-kube-api-access-d6f6c\") pod \"ae3faaec-add1-43f6-95ad-7cc26517afc3\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.467270 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-combined-ca-bundle\") pod \"ae3faaec-add1-43f6-95ad-7cc26517afc3\" (UID: \"ae3faaec-add1-43f6-95ad-7cc26517afc3\") " Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.473169 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ae3faaec-add1-43f6-95ad-7cc26517afc3" (UID: "ae3faaec-add1-43f6-95ad-7cc26517afc3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.473269 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-scripts" (OuterVolumeSpecName: "scripts") pod "ae3faaec-add1-43f6-95ad-7cc26517afc3" (UID: "ae3faaec-add1-43f6-95ad-7cc26517afc3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.473508 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3faaec-add1-43f6-95ad-7cc26517afc3-kube-api-access-d6f6c" (OuterVolumeSpecName: "kube-api-access-d6f6c") pod "ae3faaec-add1-43f6-95ad-7cc26517afc3" (UID: "ae3faaec-add1-43f6-95ad-7cc26517afc3"). InnerVolumeSpecName "kube-api-access-d6f6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.473778 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ae3faaec-add1-43f6-95ad-7cc26517afc3" (UID: "ae3faaec-add1-43f6-95ad-7cc26517afc3"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.491121 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-config-data" (OuterVolumeSpecName: "config-data") pod "ae3faaec-add1-43f6-95ad-7cc26517afc3" (UID: "ae3faaec-add1-43f6-95ad-7cc26517afc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.492562 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae3faaec-add1-43f6-95ad-7cc26517afc3" (UID: "ae3faaec-add1-43f6-95ad-7cc26517afc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.569606 4834 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.569658 4834 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.569667 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.569680 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6f6c\" (UniqueName: \"kubernetes.io/projected/ae3faaec-add1-43f6-95ad-7cc26517afc3-kube-api-access-d6f6c\") on node \"crc\" DevicePath \"\"" Nov 26 
12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.569692 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.569699 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3faaec-add1-43f6-95ad-7cc26517afc3-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:09 crc kubenswrapper[4834]: I1126 12:26:09.995930 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8df9b6575-h4v75"] Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.021203 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64b956f958-2k5zd"] Nov 26 12:26:10 crc kubenswrapper[4834]: E1126 12:26:10.021564 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3faaec-add1-43f6-95ad-7cc26517afc3" containerName="keystone-bootstrap" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.021581 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3faaec-add1-43f6-95ad-7cc26517afc3" containerName="keystone-bootstrap" Nov 26 12:26:10 crc kubenswrapper[4834]: E1126 12:26:10.021615 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" containerName="init" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.021622 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" containerName="init" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.021746 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d8a22a-02df-4adc-a9b0-534b3d2a5f5b" containerName="init" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.021768 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3faaec-add1-43f6-95ad-7cc26517afc3" 
containerName="keystone-bootstrap" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.022508 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.025761 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.044829 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64b956f958-2k5zd"] Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.072016 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b64d5bd45-4kjkh"] Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.093433 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5895978f64-t9cvb"] Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.094907 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.108886 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5895978f64-t9cvb"] Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.182873 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-secret-key\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.182958 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9347056f-b194-406f-8b28-bd86ee220403-logs\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc 
kubenswrapper[4834]: I1126 12:26:10.183008 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9347056f-b194-406f-8b28-bd86ee220403-horizon-secret-key\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183039 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkmq5\" (UniqueName: \"kubernetes.io/projected/9347056f-b194-406f-8b28-bd86ee220403-kube-api-access-lkmq5\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183070 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/4de7fa58-f514-49a8-88b0-205ef138c8a3-kube-api-access-gd9nn\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183124 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9347056f-b194-406f-8b28-bd86ee220403-config-data\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183157 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9347056f-b194-406f-8b28-bd86ee220403-combined-ca-bundle\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " 
pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183203 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-scripts\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183235 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9347056f-b194-406f-8b28-bd86ee220403-horizon-tls-certs\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183285 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4de7fa58-f514-49a8-88b0-205ef138c8a3-logs\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183355 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-combined-ca-bundle\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183420 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-tls-certs\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " 
pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183490 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-config-data\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.183538 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9347056f-b194-406f-8b28-bd86ee220403-scripts\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.226036 4834 generic.go:334] "Generic (PLEG): container finished" podID="9f95bd75-d740-47da-9ff2-d13cd8914aa4" containerID="36dc29a3b278be0dc496b3e00caf0131be7692570972dc4ef80807eb90cf6b8b" exitCode=0 Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.226125 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7lbgp" event={"ID":"9f95bd75-d740-47da-9ff2-d13cd8914aa4","Type":"ContainerDied","Data":"36dc29a3b278be0dc496b3e00caf0131be7692570972dc4ef80807eb90cf6b8b"} Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.228436 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bx29h" event={"ID":"ae3faaec-add1-43f6-95ad-7cc26517afc3","Type":"ContainerDied","Data":"919018a8dc28ae39fa6678cf094feed3e05b30bb84a5870997b722f1b05c97e2"} Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.228478 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="919018a8dc28ae39fa6678cf094feed3e05b30bb84a5870997b722f1b05c97e2" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.228539 4834 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bx29h" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292479 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-combined-ca-bundle\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292577 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-tls-certs\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292632 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-config-data\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292675 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9347056f-b194-406f-8b28-bd86ee220403-scripts\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292704 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-secret-key\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 
26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292764 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9347056f-b194-406f-8b28-bd86ee220403-logs\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292794 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9347056f-b194-406f-8b28-bd86ee220403-horizon-secret-key\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292819 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkmq5\" (UniqueName: \"kubernetes.io/projected/9347056f-b194-406f-8b28-bd86ee220403-kube-api-access-lkmq5\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292841 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/4de7fa58-f514-49a8-88b0-205ef138c8a3-kube-api-access-gd9nn\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292885 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9347056f-b194-406f-8b28-bd86ee220403-config-data\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292908 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9347056f-b194-406f-8b28-bd86ee220403-combined-ca-bundle\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.292957 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-scripts\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.293021 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9347056f-b194-406f-8b28-bd86ee220403-horizon-tls-certs\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.293055 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4de7fa58-f514-49a8-88b0-205ef138c8a3-logs\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.293476 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4de7fa58-f514-49a8-88b0-205ef138c8a3-logs\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.294336 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9347056f-b194-406f-8b28-bd86ee220403-scripts\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.294903 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9347056f-b194-406f-8b28-bd86ee220403-config-data\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.296669 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-config-data\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.296925 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9347056f-b194-406f-8b28-bd86ee220403-logs\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.297280 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-combined-ca-bundle\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.297423 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-scripts\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " 
pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.304844 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9347056f-b194-406f-8b28-bd86ee220403-horizon-secret-key\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.307379 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9347056f-b194-406f-8b28-bd86ee220403-combined-ca-bundle\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.315139 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-tls-certs\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.316393 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-secret-key\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.316557 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/4de7fa58-f514-49a8-88b0-205ef138c8a3-kube-api-access-gd9nn\") pod \"horizon-64b956f958-2k5zd\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 
12:26:10.316604 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9347056f-b194-406f-8b28-bd86ee220403-horizon-tls-certs\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.318876 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkmq5\" (UniqueName: \"kubernetes.io/projected/9347056f-b194-406f-8b28-bd86ee220403-kube-api-access-lkmq5\") pod \"horizon-5895978f64-t9cvb\" (UID: \"9347056f-b194-406f-8b28-bd86ee220403\") " pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.350139 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.409998 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.440299 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bx29h"] Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.446330 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bx29h"] Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.546822 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lq9rd"] Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.547963 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.549700 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.549907 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.552667 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lq9rd"] Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.552876 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.552940 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8frzm" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.553425 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.700967 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-scripts\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.701005 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-config-data\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.701034 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-combined-ca-bundle\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.701062 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6dl5\" (UniqueName: \"kubernetes.io/projected/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-kube-api-access-c6dl5\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.701430 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-fernet-keys\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.701674 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-credential-keys\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.803471 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-scripts\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.803512 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-config-data\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.803537 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-combined-ca-bundle\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.803561 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6dl5\" (UniqueName: \"kubernetes.io/projected/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-kube-api-access-c6dl5\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.803640 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-fernet-keys\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.803713 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-credential-keys\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.807926 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-credential-keys\") pod 
\"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.808032 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-combined-ca-bundle\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.808280 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-fernet-keys\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.808722 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-config-data\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.811709 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-scripts\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc kubenswrapper[4834]: I1126 12:26:10.818301 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6dl5\" (UniqueName: \"kubernetes.io/projected/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-kube-api-access-c6dl5\") pod \"keystone-bootstrap-lq9rd\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:10 crc 
kubenswrapper[4834]: I1126 12:26:10.871369 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:12 crc kubenswrapper[4834]: I1126 12:26:12.138498 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:12 crc kubenswrapper[4834]: I1126 12:26:12.198697 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b58765b5-2sx2l"] Nov 26 12:26:12 crc kubenswrapper[4834]: I1126 12:26:12.199293 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerName="dnsmasq-dns" containerID="cri-o://27bae8479412e3408c968208fbf63e1480b1083adf97a9c52ac6329d7364ed93" gracePeriod=10 Nov 26 12:26:12 crc kubenswrapper[4834]: I1126 12:26:12.452094 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae3faaec-add1-43f6-95ad-7cc26517afc3" path="/var/lib/kubelet/pods/ae3faaec-add1-43f6-95ad-7cc26517afc3/volumes" Nov 26 12:26:12 crc kubenswrapper[4834]: I1126 12:26:12.467521 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused" Nov 26 12:26:13 crc kubenswrapper[4834]: I1126 12:26:13.257100 4834 generic.go:334] "Generic (PLEG): container finished" podID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerID="27bae8479412e3408c968208fbf63e1480b1083adf97a9c52ac6329d7364ed93" exitCode=0 Nov 26 12:26:13 crc kubenswrapper[4834]: I1126 12:26:13.257156 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" 
event={"ID":"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5","Type":"ContainerDied","Data":"27bae8479412e3408c968208fbf63e1480b1083adf97a9c52ac6329d7364ed93"} Nov 26 12:26:15 crc kubenswrapper[4834]: E1126 12:26:15.872297 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057" Nov 26 12:26:15 crc kubenswrapper[4834]: E1126 12:26:15.872756 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n589h649h84h5bfh5b4h575h59bh554h5f6h66h596hbbh54dhc6h67fh5dh5b9h68ch694h74h56bhf8h9dh57fh679h549hdch597h57fh54h596h5c6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wb8w8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev
/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8df9b6575-h4v75_openstack(c49a442e-414a-4d5e-b45d-15e57570625e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 12:26:15 crc kubenswrapper[4834]: E1126 12:26:15.875275 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057\\\"\"]" pod="openstack/horizon-8df9b6575-h4v75" podUID="c49a442e-414a-4d5e-b45d-15e57570625e" Nov 26 12:26:16 crc kubenswrapper[4834]: E1126 12:26:16.086208 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:d375d370be5ead0dac71109af644849e5795f535f9ad8eeacea261d77ae6f140" Nov 26 12:26:16 crc kubenswrapper[4834]: E1126 12:26:16.086383 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:d375d370be5ead0dac71109af644849e5795f535f9ad8eeacea261d77ae6f140,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b7h5fdh659h589h669h5bfhd8hc7h556hc8h68bhfbh56h67h64chcbh667hcch7h6ch5dch79h584h5cbh56ch585hc4h55h6ch578hd5h649q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xhrzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(dfde693e-68da-45d5-803a-c8a0e36a198a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 12:26:17 crc kubenswrapper[4834]: E1126 12:26:17.112865 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:7dd2e0dbb6bb5a6cecd1763e43479ca8cb6a0c502534e83c8795c0da2b50e099" Nov 26 12:26:17 crc kubenswrapper[4834]: E1126 12:26:17.113274 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:7dd2e0dbb6bb5a6cecd1763e43479ca8cb6a0c502534e83c8795c0da2b50e099,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njdl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-rjgqd_openstack(d96f008c-e967-4142-a433-92edb4634097): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 12:26:17 crc kubenswrapper[4834]: E1126 12:26:17.114505 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-rjgqd" podUID="d96f008c-e967-4142-a433-92edb4634097" Nov 26 12:26:17 crc kubenswrapper[4834]: E1126 12:26:17.289967 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:7dd2e0dbb6bb5a6cecd1763e43479ca8cb6a0c502534e83c8795c0da2b50e099\\\"\"" pod="openstack/placement-db-sync-rjgqd" podUID="d96f008c-e967-4142-a433-92edb4634097" Nov 26 12:26:17 crc kubenswrapper[4834]: E1126 12:26:17.434479 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645" Nov 26 12:26:17 crc kubenswrapper[4834]: E1126 12:26:17.434616 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-244dd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-6c9ml_openstack(2eac617a-c5a8-44c6-b790-55ec23e59e5a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 12:26:17 crc kubenswrapper[4834]: E1126 12:26:17.435732 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-6c9ml" 
podUID="2eac617a-c5a8-44c6-b790-55ec23e59e5a" Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.466938 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused" Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.495763 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.543576 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-config\") pod \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.543670 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-combined-ca-bundle\") pod \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.543944 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkm6h\" (UniqueName: \"kubernetes.io/projected/9f95bd75-d740-47da-9ff2-d13cd8914aa4-kube-api-access-hkm6h\") pod \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\" (UID: \"9f95bd75-d740-47da-9ff2-d13cd8914aa4\") " Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.557833 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f95bd75-d740-47da-9ff2-d13cd8914aa4-kube-api-access-hkm6h" (OuterVolumeSpecName: "kube-api-access-hkm6h") pod "9f95bd75-d740-47da-9ff2-d13cd8914aa4" (UID: "9f95bd75-d740-47da-9ff2-d13cd8914aa4"). 
InnerVolumeSpecName "kube-api-access-hkm6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.565382 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f95bd75-d740-47da-9ff2-d13cd8914aa4" (UID: "9f95bd75-d740-47da-9ff2-d13cd8914aa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.567917 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-config" (OuterVolumeSpecName: "config") pod "9f95bd75-d740-47da-9ff2-d13cd8914aa4" (UID: "9f95bd75-d740-47da-9ff2-d13cd8914aa4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.647048 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkm6h\" (UniqueName: \"kubernetes.io/projected/9f95bd75-d740-47da-9ff2-d13cd8914aa4-kube-api-access-hkm6h\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.647285 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:17 crc kubenswrapper[4834]: I1126 12:26:17.647296 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f95bd75-d740-47da-9ff2-d13cd8914aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.296918 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7lbgp" 
event={"ID":"9f95bd75-d740-47da-9ff2-d13cd8914aa4","Type":"ContainerDied","Data":"0b27da36893db5e5036fb1ca9a136a379aeed43a94ffc3414abb39dd27513e5c"} Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.297029 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b27da36893db5e5036fb1ca9a136a379aeed43a94ffc3414abb39dd27513e5c" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.297208 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7lbgp" Nov 26 12:26:18 crc kubenswrapper[4834]: E1126 12:26:18.301098 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645\\\"\"" pod="openstack/barbican-db-sync-6c9ml" podUID="2eac617a-c5a8-44c6-b790-55ec23e59e5a" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.700015 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c65849c7f-n2g6v"] Nov 26 12:26:18 crc kubenswrapper[4834]: E1126 12:26:18.700368 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f95bd75-d740-47da-9ff2-d13cd8914aa4" containerName="neutron-db-sync" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.700387 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f95bd75-d740-47da-9ff2-d13cd8914aa4" containerName="neutron-db-sync" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.700555 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f95bd75-d740-47da-9ff2-d13cd8914aa4" containerName="neutron-db-sync" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.701337 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.709139 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c65849c7f-n2g6v"] Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.767973 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srzrm\" (UniqueName: \"kubernetes.io/projected/90315a1e-440b-414d-af26-a31c178faf53-kube-api-access-srzrm\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.768030 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-dns-svc\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.768086 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-sb\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.768400 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-config\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.768648 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-nb\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.871293 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srzrm\" (UniqueName: \"kubernetes.io/projected/90315a1e-440b-414d-af26-a31c178faf53-kube-api-access-srzrm\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.871441 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-dns-svc\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.871533 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-sb\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.871627 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-config\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.871695 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-nb\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.872518 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-sb\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.872603 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-config\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.873066 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-nb\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.873134 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-dns-svc\") pod \"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.888443 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srzrm\" (UniqueName: \"kubernetes.io/projected/90315a1e-440b-414d-af26-a31c178faf53-kube-api-access-srzrm\") pod 
\"dnsmasq-dns-7c65849c7f-n2g6v\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.972406 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d947cb56d-gkmf6"] Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.974294 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.976283 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xlj8v" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.977282 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.977551 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.977887 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 26 12:26:18 crc kubenswrapper[4834]: I1126 12:26:18.979967 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d947cb56d-gkmf6"] Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.018911 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.077807 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-combined-ca-bundle\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.077886 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-ovndb-tls-certs\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.077912 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-httpd-config\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.078017 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-config\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.078239 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssdlp\" (UniqueName: \"kubernetes.io/projected/35f03545-c45a-4288-8877-d3bb326d6ab8-kube-api-access-ssdlp\") pod \"neutron-7d947cb56d-gkmf6\" (UID: 
\"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.179964 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-config\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.180090 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssdlp\" (UniqueName: \"kubernetes.io/projected/35f03545-c45a-4288-8877-d3bb326d6ab8-kube-api-access-ssdlp\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.180217 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-combined-ca-bundle\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.180294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-ovndb-tls-certs\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.180342 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-httpd-config\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 
crc kubenswrapper[4834]: I1126 12:26:19.189079 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-ovndb-tls-certs\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.189446 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-httpd-config\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.189827 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-config\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.190255 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-combined-ca-bundle\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.195676 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssdlp\" (UniqueName: \"kubernetes.io/projected/35f03545-c45a-4288-8877-d3bb326d6ab8-kube-api-access-ssdlp\") pod \"neutron-7d947cb56d-gkmf6\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:19 crc kubenswrapper[4834]: I1126 12:26:19.291378 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.884384 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bcb769647-fxvwp"] Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.886074 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.888086 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.890100 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.900080 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bcb769647-fxvwp"] Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.916239 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-combined-ca-bundle\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.916282 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-internal-tls-certs\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.916348 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-ovndb-tls-certs\") pod 
\"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.916378 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrlx8\" (UniqueName: \"kubernetes.io/projected/c2ac86af-2073-466e-ad2b-5203fbd8036f-kube-api-access-wrlx8\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.916415 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-public-tls-certs\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.916435 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-config\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:20 crc kubenswrapper[4834]: I1126 12:26:20.916611 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-httpd-config\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.017631 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-ovndb-tls-certs\") pod \"neutron-bcb769647-fxvwp\" (UID: 
\"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.017696 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrlx8\" (UniqueName: \"kubernetes.io/projected/c2ac86af-2073-466e-ad2b-5203fbd8036f-kube-api-access-wrlx8\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.017750 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-public-tls-certs\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.017771 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-config\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.017842 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-httpd-config\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.018090 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-combined-ca-bundle\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc 
kubenswrapper[4834]: I1126 12:26:21.018132 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-internal-tls-certs\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.025723 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-internal-tls-certs\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.026181 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-httpd-config\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.027696 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-combined-ca-bundle\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.028145 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-ovndb-tls-certs\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.032959 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-config\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.038449 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrlx8\" (UniqueName: \"kubernetes.io/projected/c2ac86af-2073-466e-ad2b-5203fbd8036f-kube-api-access-wrlx8\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.040233 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ac86af-2073-466e-ad2b-5203fbd8036f-public-tls-certs\") pod \"neutron-bcb769647-fxvwp\" (UID: \"c2ac86af-2073-466e-ad2b-5203fbd8036f\") " pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.208033 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.531187 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:26:21 crc kubenswrapper[4834]: I1126 12:26:21.531238 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.153949 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.164160 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.175930 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c49a442e-414a-4d5e-b45d-15e57570625e-logs\") pod \"c49a442e-414a-4d5e-b45d-15e57570625e\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.175996 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-nb\") pod \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.176049 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-dns-svc\") pod \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.176099 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-sb\") pod \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.176153 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-config\") pod \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.176169 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/c49a442e-414a-4d5e-b45d-15e57570625e-horizon-secret-key\") pod \"c49a442e-414a-4d5e-b45d-15e57570625e\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.176208 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x75vl\" (UniqueName: \"kubernetes.io/projected/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-kube-api-access-x75vl\") pod \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\" (UID: \"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.176247 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-scripts\") pod \"c49a442e-414a-4d5e-b45d-15e57570625e\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.176274 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-config-data\") pod \"c49a442e-414a-4d5e-b45d-15e57570625e\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.176296 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb8w8\" (UniqueName: \"kubernetes.io/projected/c49a442e-414a-4d5e-b45d-15e57570625e-kube-api-access-wb8w8\") pod \"c49a442e-414a-4d5e-b45d-15e57570625e\" (UID: \"c49a442e-414a-4d5e-b45d-15e57570625e\") " Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.176949 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49a442e-414a-4d5e-b45d-15e57570625e-logs" (OuterVolumeSpecName: "logs") pod "c49a442e-414a-4d5e-b45d-15e57570625e" (UID: "c49a442e-414a-4d5e-b45d-15e57570625e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.177096 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-config-data" (OuterVolumeSpecName: "config-data") pod "c49a442e-414a-4d5e-b45d-15e57570625e" (UID: "c49a442e-414a-4d5e-b45d-15e57570625e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.177252 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-scripts" (OuterVolumeSpecName: "scripts") pod "c49a442e-414a-4d5e-b45d-15e57570625e" (UID: "c49a442e-414a-4d5e-b45d-15e57570625e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.177990 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.178012 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c49a442e-414a-4d5e-b45d-15e57570625e-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.178022 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c49a442e-414a-4d5e-b45d-15e57570625e-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.181625 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-kube-api-access-x75vl" (OuterVolumeSpecName: "kube-api-access-x75vl") pod "722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" (UID: 
"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5"). InnerVolumeSpecName "kube-api-access-x75vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.181762 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c49a442e-414a-4d5e-b45d-15e57570625e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c49a442e-414a-4d5e-b45d-15e57570625e" (UID: "c49a442e-414a-4d5e-b45d-15e57570625e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.182123 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49a442e-414a-4d5e-b45d-15e57570625e-kube-api-access-wb8w8" (OuterVolumeSpecName: "kube-api-access-wb8w8") pod "c49a442e-414a-4d5e-b45d-15e57570625e" (UID: "c49a442e-414a-4d5e-b45d-15e57570625e"). InnerVolumeSpecName "kube-api-access-wb8w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.217128 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-config" (OuterVolumeSpecName: "config") pod "722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" (UID: "722cd8b2-0b69-4de7-8d3c-8f794dafb9a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.220719 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" (UID: "722cd8b2-0b69-4de7-8d3c-8f794dafb9a5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.224838 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" (UID: "722cd8b2-0b69-4de7-8d3c-8f794dafb9a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.226994 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" (UID: "722cd8b2-0b69-4de7-8d3c-8f794dafb9a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.279554 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb8w8\" (UniqueName: \"kubernetes.io/projected/c49a442e-414a-4d5e-b45d-15e57570625e-kube-api-access-wb8w8\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.279584 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.279594 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.279605 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 
12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.279612 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.279620 4834 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c49a442e-414a-4d5e-b45d-15e57570625e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.279630 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x75vl\" (UniqueName: \"kubernetes.io/projected/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5-kube-api-access-x75vl\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.347462 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8df9b6575-h4v75" event={"ID":"c49a442e-414a-4d5e-b45d-15e57570625e","Type":"ContainerDied","Data":"c8b9577ed2b6f0d7e93ee87e42d3db55016f6adb61dea0785ee6ec13b8f0f8ed"} Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.347543 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8df9b6575-h4v75" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.353258 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" event={"ID":"722cd8b2-0b69-4de7-8d3c-8f794dafb9a5","Type":"ContainerDied","Data":"196f23abfc82092ea1b30a492cd702d20139b45e360c129f92a77134f1abdfb7"} Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.353283 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.353575 4834 scope.go:117] "RemoveContainer" containerID="27bae8479412e3408c968208fbf63e1480b1083adf97a9c52ac6329d7364ed93" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.403847 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8df9b6575-h4v75"] Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.409997 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8df9b6575-h4v75"] Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.432912 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c49a442e-414a-4d5e-b45d-15e57570625e" path="/var/lib/kubelet/pods/c49a442e-414a-4d5e-b45d-15e57570625e/volumes" Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.433371 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b58765b5-2sx2l"] Nov 26 12:26:24 crc kubenswrapper[4834]: I1126 12:26:24.433396 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75b58765b5-2sx2l"] Nov 26 12:26:25 crc kubenswrapper[4834]: E1126 12:26:25.029088 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879" Nov 26 12:26:25 crc kubenswrapper[4834]: E1126 12:26:25.029256 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn8sn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-kj94c_openstack(0ddc5896-100d-473a-9bed-a2e13560bc8e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 12:26:25 crc kubenswrapper[4834]: E1126 12:26:25.030536 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-kj94c" podUID="0ddc5896-100d-473a-9bed-a2e13560bc8e" Nov 26 12:26:25 crc kubenswrapper[4834]: I1126 12:26:25.294677 4834 scope.go:117] "RemoveContainer" containerID="5c52e0b9faa634b6d7453e8992ee6fbedf1d0af971e81430138d6b67979f4144" Nov 26 12:26:25 crc kubenswrapper[4834]: E1126 12:26:25.396053 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879\\\"\"" pod="openstack/cinder-db-sync-kj94c" podUID="0ddc5896-100d-473a-9bed-a2e13560bc8e" Nov 26 12:26:25 crc kubenswrapper[4834]: I1126 12:26:25.674958 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5895978f64-t9cvb"] Nov 26 12:26:25 crc kubenswrapper[4834]: I1126 12:26:25.758984 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lq9rd"] Nov 26 12:26:25 crc kubenswrapper[4834]: I1126 12:26:25.772109 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c65849c7f-n2g6v"] Nov 26 12:26:25 crc kubenswrapper[4834]: I1126 12:26:25.779372 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-64b956f958-2k5zd"] Nov 26 12:26:25 crc kubenswrapper[4834]: W1126 12:26:25.780746 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90315a1e_440b_414d_af26_a31c178faf53.slice/crio-57977a60b69710b7a3f5f1baf1e488f82e7dc8ddd92adb307fe3b8e9a5030734 WatchSource:0}: Error finding container 57977a60b69710b7a3f5f1baf1e488f82e7dc8ddd92adb307fe3b8e9a5030734: Status 404 returned error can't find the container with id 57977a60b69710b7a3f5f1baf1e488f82e7dc8ddd92adb307fe3b8e9a5030734 Nov 26 12:26:25 crc kubenswrapper[4834]: I1126 12:26:25.965041 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bcb769647-fxvwp"] Nov 26 12:26:25 crc kubenswrapper[4834]: W1126 12:26:25.984458 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2ac86af_2073_466e_ad2b_5203fbd8036f.slice/crio-11d2fbfc61f766776ca83ff47b9de14fe6db94929495d8190bb16e8c721c4362 WatchSource:0}: Error finding container 11d2fbfc61f766776ca83ff47b9de14fe6db94929495d8190bb16e8c721c4362: Status 404 returned error can't find the container with id 11d2fbfc61f766776ca83ff47b9de14fe6db94929495d8190bb16e8c721c4362 Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.055057 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d947cb56d-gkmf6"] Nov 26 12:26:26 crc kubenswrapper[4834]: W1126 12:26:26.059696 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f03545_c45a_4288_8877_d3bb326d6ab8.slice/crio-44ff63c5dbfd5a524a153afc3989e0c34391df7235181d2c7431f3c0056d4fb3 WatchSource:0}: Error finding container 44ff63c5dbfd5a524a153afc3989e0c34391df7235181d2c7431f3c0056d4fb3: Status 404 returned error can't find the container with id 44ff63c5dbfd5a524a153afc3989e0c34391df7235181d2c7431f3c0056d4fb3 Nov 26 
12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.403275 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b64d5bd45-4kjkh" event={"ID":"23fd02e3-eaab-4e25-b9f7-74bee38551e7","Type":"ContainerStarted","Data":"428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.403340 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b64d5bd45-4kjkh" event={"ID":"23fd02e3-eaab-4e25-b9f7-74bee38551e7","Type":"ContainerStarted","Data":"c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.403457 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b64d5bd45-4kjkh" podUID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerName="horizon-log" containerID="cri-o://c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87" gracePeriod=30 Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.404473 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b64d5bd45-4kjkh" podUID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerName="horizon" containerID="cri-o://428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36" gracePeriod=30 Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.410976 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64b956f958-2k5zd" event={"ID":"4de7fa58-f514-49a8-88b0-205ef138c8a3","Type":"ContainerStarted","Data":"9e4fcaac3592dd2cfcbe81a7173beea0340ff06e304762077d856046e28d7416"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.411010 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64b956f958-2k5zd" event={"ID":"4de7fa58-f514-49a8-88b0-205ef138c8a3","Type":"ContainerStarted","Data":"b1d2249bb255822f9a4b38553c1dc1bfbdc8ce15098e178e88a006df9ed337af"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 
12:26:26.411022 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64b956f958-2k5zd" event={"ID":"4de7fa58-f514-49a8-88b0-205ef138c8a3","Type":"ContainerStarted","Data":"5b2750ed77556e5c8d1be1639927631aa7b031dfe843ffcb9e1e9d42cc4551f1"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.438302 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b64d5bd45-4kjkh" podStartSLOduration=2.811175312 podStartE2EDuration="23.43828349s" podCreationTimestamp="2025-11-26 12:26:03 +0000 UTC" firstStartedPulling="2025-11-26 12:26:04.37831882 +0000 UTC m=+862.285532172" lastFinishedPulling="2025-11-26 12:26:25.005426999 +0000 UTC m=+882.912640350" observedRunningTime="2025-11-26 12:26:26.421901419 +0000 UTC m=+884.329114770" watchObservedRunningTime="2025-11-26 12:26:26.43828349 +0000 UTC m=+884.345496843" Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.444583 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" path="/var/lib/kubelet/pods/722cd8b2-0b69-4de7-8d3c-8f794dafb9a5/volumes" Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.446289 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bcb769647-fxvwp" event={"ID":"c2ac86af-2073-466e-ad2b-5203fbd8036f","Type":"ContainerStarted","Data":"cc74f271d8408b975998fec7edb52560382c29eeb96257a04dc3e09d4fc12dde"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.446352 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bcb769647-fxvwp" event={"ID":"c2ac86af-2073-466e-ad2b-5203fbd8036f","Type":"ContainerStarted","Data":"11d2fbfc61f766776ca83ff47b9de14fe6db94929495d8190bb16e8c721c4362"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.446364 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lq9rd" 
event={"ID":"53beea6d-dc20-4423-b1eb-ff58ff4c2c69","Type":"ContainerStarted","Data":"a6dbc3a0a3a93678bfb749c6e279736cead11a4a8a7bc07dcef5a394ed4b721a"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.446376 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lq9rd" event={"ID":"53beea6d-dc20-4423-b1eb-ff58ff4c2c69","Type":"ContainerStarted","Data":"31216365e59c9ee804f27859e7cb9fb92649d03e4f40ddaf4950e8bb30612701"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.450173 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86d6455c5c-px2dk" event={"ID":"8d21dfc7-f906-45e6-99f3-bac942be8ed9","Type":"ContainerStarted","Data":"efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.450206 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86d6455c5c-px2dk" event={"ID":"8d21dfc7-f906-45e6-99f3-bac942be8ed9","Type":"ContainerStarted","Data":"350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.450372 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86d6455c5c-px2dk" podUID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerName="horizon-log" containerID="cri-o://350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16" gracePeriod=30 Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.450473 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86d6455c5c-px2dk" podUID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerName="horizon" containerID="cri-o://efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1" gracePeriod=30 Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.455388 4834 generic.go:334] "Generic (PLEG): container finished" podID="90315a1e-440b-414d-af26-a31c178faf53" 
containerID="1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8" exitCode=0 Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.455446 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" event={"ID":"90315a1e-440b-414d-af26-a31c178faf53","Type":"ContainerDied","Data":"1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.455469 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" event={"ID":"90315a1e-440b-414d-af26-a31c178faf53","Type":"ContainerStarted","Data":"57977a60b69710b7a3f5f1baf1e488f82e7dc8ddd92adb307fe3b8e9a5030734"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.457372 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64b956f958-2k5zd" podStartSLOduration=17.452291492 podStartE2EDuration="17.452291492s" podCreationTimestamp="2025-11-26 12:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:26.446063913 +0000 UTC m=+884.353277265" watchObservedRunningTime="2025-11-26 12:26:26.452291492 +0000 UTC m=+884.359504843" Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.468779 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lq9rd" podStartSLOduration=16.468761479 podStartE2EDuration="16.468761479s" podCreationTimestamp="2025-11-26 12:26:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:26.461905528 +0000 UTC m=+884.369118880" watchObservedRunningTime="2025-11-26 12:26:26.468761479 +0000 UTC m=+884.375974832" Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.474498 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-5895978f64-t9cvb" event={"ID":"9347056f-b194-406f-8b28-bd86ee220403","Type":"ContainerStarted","Data":"3164da5842cad520bb85f50abf68afce691cbd11c13865686c3a31601c16275c"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.474761 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5895978f64-t9cvb" event={"ID":"9347056f-b194-406f-8b28-bd86ee220403","Type":"ContainerStarted","Data":"94f8b0fe481590b92306ac34baed5d5dafe25e357dd86caf3b603489246f5c19"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.474776 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5895978f64-t9cvb" event={"ID":"9347056f-b194-406f-8b28-bd86ee220403","Type":"ContainerStarted","Data":"3cfe56de8d8d74aa9cd8609932ef44a1351efd70a603443f4f810c6e86396164"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.485094 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfde693e-68da-45d5-803a-c8a0e36a198a","Type":"ContainerStarted","Data":"3cb9146b384da2fc97aabd6065964b95908546eb74a18b646ed19772282e5130"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.503528 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d947cb56d-gkmf6" event={"ID":"35f03545-c45a-4288-8877-d3bb326d6ab8","Type":"ContainerStarted","Data":"bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.503571 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d947cb56d-gkmf6" event={"ID":"35f03545-c45a-4288-8877-d3bb326d6ab8","Type":"ContainerStarted","Data":"44ff63c5dbfd5a524a153afc3989e0c34391df7235181d2c7431f3c0056d4fb3"} Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.504474 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.508303 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-86d6455c5c-px2dk" podStartSLOduration=2.614229334 podStartE2EDuration="25.508291796s" podCreationTimestamp="2025-11-26 12:26:01 +0000 UTC" firstStartedPulling="2025-11-26 12:26:02.369286297 +0000 UTC m=+860.276499648" lastFinishedPulling="2025-11-26 12:26:25.263348758 +0000 UTC m=+883.170562110" observedRunningTime="2025-11-26 12:26:26.504045428 +0000 UTC m=+884.411258781" watchObservedRunningTime="2025-11-26 12:26:26.508291796 +0000 UTC m=+884.415505148" Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.523353 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5895978f64-t9cvb" podStartSLOduration=16.52334231 podStartE2EDuration="16.52334231s" podCreationTimestamp="2025-11-26 12:26:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:26.518979895 +0000 UTC m=+884.426193247" watchObservedRunningTime="2025-11-26 12:26:26.52334231 +0000 UTC m=+884.430555662" Nov 26 12:26:26 crc kubenswrapper[4834]: I1126 12:26:26.566191 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d947cb56d-gkmf6" podStartSLOduration=8.566168904 podStartE2EDuration="8.566168904s" podCreationTimestamp="2025-11-26 12:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:26.534592929 +0000 UTC m=+884.441806281" watchObservedRunningTime="2025-11-26 12:26:26.566168904 +0000 UTC m=+884.473382257" Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.470445 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75b58765b5-2sx2l" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Nov 26 
12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.519512 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" event={"ID":"90315a1e-440b-414d-af26-a31c178faf53","Type":"ContainerStarted","Data":"d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534"} Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.520029 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.521419 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/0.log" Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.522158 4834 generic.go:334] "Generic (PLEG): container finished" podID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerID="5686524be4daf095bbebf3a34f9b3d590e21125568fce8a97a203adf8e70d443" exitCode=1 Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.522572 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d947cb56d-gkmf6" event={"ID":"35f03545-c45a-4288-8877-d3bb326d6ab8","Type":"ContainerDied","Data":"5686524be4daf095bbebf3a34f9b3d590e21125568fce8a97a203adf8e70d443"} Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.523181 4834 scope.go:117] "RemoveContainer" containerID="5686524be4daf095bbebf3a34f9b3d590e21125568fce8a97a203adf8e70d443" Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.528213 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bcb769647-fxvwp" event={"ID":"c2ac86af-2073-466e-ad2b-5203fbd8036f","Type":"ContainerStarted","Data":"86106fac43c50eaf427af82b0bb5033c5764834e4abc69e3a83305823d3e0c47"} Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.529171 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.542801 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" podStartSLOduration=9.542778335 podStartE2EDuration="9.542778335s" podCreationTimestamp="2025-11-26 12:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:27.533850391 +0000 UTC m=+885.441063743" watchObservedRunningTime="2025-11-26 12:26:27.542778335 +0000 UTC m=+885.449991688" Nov 26 12:26:27 crc kubenswrapper[4834]: I1126 12:26:27.585656 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bcb769647-fxvwp" podStartSLOduration=7.585638393 podStartE2EDuration="7.585638393s" podCreationTimestamp="2025-11-26 12:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:27.579266954 +0000 UTC m=+885.486480306" watchObservedRunningTime="2025-11-26 12:26:27.585638393 +0000 UTC m=+885.492851745" Nov 26 12:26:28 crc kubenswrapper[4834]: I1126 12:26:28.538661 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/1.log" Nov 26 12:26:28 crc kubenswrapper[4834]: I1126 12:26:28.540070 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/0.log" Nov 26 12:26:28 crc kubenswrapper[4834]: I1126 12:26:28.540523 4834 generic.go:334] "Generic (PLEG): container finished" podID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerID="2c286212e07ffff368fe89eae09573970a75dc61ae582ec636aa7168d83f1c53" exitCode=1 Nov 26 12:26:28 crc kubenswrapper[4834]: I1126 12:26:28.540644 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d947cb56d-gkmf6" 
event={"ID":"35f03545-c45a-4288-8877-d3bb326d6ab8","Type":"ContainerDied","Data":"2c286212e07ffff368fe89eae09573970a75dc61ae582ec636aa7168d83f1c53"} Nov 26 12:26:28 crc kubenswrapper[4834]: I1126 12:26:28.541218 4834 scope.go:117] "RemoveContainer" containerID="2c286212e07ffff368fe89eae09573970a75dc61ae582ec636aa7168d83f1c53" Nov 26 12:26:28 crc kubenswrapper[4834]: E1126 12:26:28.541491 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-7d947cb56d-gkmf6_openstack(35f03545-c45a-4288-8877-d3bb326d6ab8)\"" pod="openstack/neutron-7d947cb56d-gkmf6" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" Nov 26 12:26:28 crc kubenswrapper[4834]: I1126 12:26:28.541719 4834 scope.go:117] "RemoveContainer" containerID="5686524be4daf095bbebf3a34f9b3d590e21125568fce8a97a203adf8e70d443" Nov 26 12:26:29 crc kubenswrapper[4834]: I1126 12:26:29.550053 4834 generic.go:334] "Generic (PLEG): container finished" podID="53beea6d-dc20-4423-b1eb-ff58ff4c2c69" containerID="a6dbc3a0a3a93678bfb749c6e279736cead11a4a8a7bc07dcef5a394ed4b721a" exitCode=0 Nov 26 12:26:29 crc kubenswrapper[4834]: I1126 12:26:29.550131 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lq9rd" event={"ID":"53beea6d-dc20-4423-b1eb-ff58ff4c2c69","Type":"ContainerDied","Data":"a6dbc3a0a3a93678bfb749c6e279736cead11a4a8a7bc07dcef5a394ed4b721a"} Nov 26 12:26:29 crc kubenswrapper[4834]: I1126 12:26:29.551040 4834 scope.go:117] "RemoveContainer" containerID="2c286212e07ffff368fe89eae09573970a75dc61ae582ec636aa7168d83f1c53" Nov 26 12:26:29 crc kubenswrapper[4834]: E1126 12:26:29.551350 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd 
pod=neutron-7d947cb56d-gkmf6_openstack(35f03545-c45a-4288-8877-d3bb326d6ab8)\"" pod="openstack/neutron-7d947cb56d-gkmf6" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" Nov 26 12:26:30 crc kubenswrapper[4834]: I1126 12:26:30.351054 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:30 crc kubenswrapper[4834]: I1126 12:26:30.351124 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:30 crc kubenswrapper[4834]: I1126 12:26:30.411160 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:30 crc kubenswrapper[4834]: I1126 12:26:30.411204 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.572969 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lq9rd" event={"ID":"53beea6d-dc20-4423-b1eb-ff58ff4c2c69","Type":"ContainerDied","Data":"31216365e59c9ee804f27859e7cb9fb92649d03e4f40ddaf4950e8bb30612701"} Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.573462 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31216365e59c9ee804f27859e7cb9fb92649d03e4f40ddaf4950e8bb30612701" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.624269 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.665354 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6dl5\" (UniqueName: \"kubernetes.io/projected/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-kube-api-access-c6dl5\") pod \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.665421 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-combined-ca-bundle\") pod \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.665495 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-config-data\") pod \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.665569 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-credential-keys\") pod \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.665595 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-fernet-keys\") pod \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.665648 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-scripts\") pod \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\" (UID: \"53beea6d-dc20-4423-b1eb-ff58ff4c2c69\") " Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.674895 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-scripts" (OuterVolumeSpecName: "scripts") pod "53beea6d-dc20-4423-b1eb-ff58ff4c2c69" (UID: "53beea6d-dc20-4423-b1eb-ff58ff4c2c69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.675008 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "53beea6d-dc20-4423-b1eb-ff58ff4c2c69" (UID: "53beea6d-dc20-4423-b1eb-ff58ff4c2c69"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.678136 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-kube-api-access-c6dl5" (OuterVolumeSpecName: "kube-api-access-c6dl5") pod "53beea6d-dc20-4423-b1eb-ff58ff4c2c69" (UID: "53beea6d-dc20-4423-b1eb-ff58ff4c2c69"). InnerVolumeSpecName "kube-api-access-c6dl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.682425 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "53beea6d-dc20-4423-b1eb-ff58ff4c2c69" (UID: "53beea6d-dc20-4423-b1eb-ff58ff4c2c69"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.728408 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53beea6d-dc20-4423-b1eb-ff58ff4c2c69" (UID: "53beea6d-dc20-4423-b1eb-ff58ff4c2c69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.748064 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-config-data" (OuterVolumeSpecName: "config-data") pod "53beea6d-dc20-4423-b1eb-ff58ff4c2c69" (UID: "53beea6d-dc20-4423-b1eb-ff58ff4c2c69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.770546 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6dl5\" (UniqueName: \"kubernetes.io/projected/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-kube-api-access-c6dl5\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.770578 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.770588 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.770598 4834 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-credential-keys\") on node \"crc\" 
DevicePath \"\"" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.770606 4834 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.770613 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53beea6d-dc20-4423-b1eb-ff58ff4c2c69-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:31 crc kubenswrapper[4834]: I1126 12:26:31.846551 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.589669 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6c9ml" event={"ID":"2eac617a-c5a8-44c6-b790-55ec23e59e5a","Type":"ContainerStarted","Data":"307f633476b01b2a97375072d9c897b59d86ac5e923e230b01a11b8413ad965d"} Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.591908 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rjgqd" event={"ID":"d96f008c-e967-4142-a433-92edb4634097","Type":"ContainerStarted","Data":"edf55806b92d1dad809694bfadccf8d9051ede3bfb157924b86fd74714adce77"} Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.599340 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfde693e-68da-45d5-803a-c8a0e36a198a","Type":"ContainerStarted","Data":"3e31d5168665dd1bbdc361ae4db6fe75589cb8a019033458c739cdca0e68ea85"} Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.600768 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/1.log" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.601121 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lq9rd" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.611874 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6c9ml" podStartSLOduration=2.381559763 podStartE2EDuration="30.611858719s" podCreationTimestamp="2025-11-26 12:26:02 +0000 UTC" firstStartedPulling="2025-11-26 12:26:03.146648999 +0000 UTC m=+861.053862351" lastFinishedPulling="2025-11-26 12:26:31.376947955 +0000 UTC m=+889.284161307" observedRunningTime="2025-11-26 12:26:32.608592239 +0000 UTC m=+890.515805591" watchObservedRunningTime="2025-11-26 12:26:32.611858719 +0000 UTC m=+890.519072072" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.622902 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-rjgqd" podStartSLOduration=2.867548126 podStartE2EDuration="31.622881019s" podCreationTimestamp="2025-11-26 12:26:01 +0000 UTC" firstStartedPulling="2025-11-26 12:26:02.641604358 +0000 UTC m=+860.548817710" lastFinishedPulling="2025-11-26 12:26:31.396937251 +0000 UTC m=+889.304150603" observedRunningTime="2025-11-26 12:26:32.621690266 +0000 UTC m=+890.528903618" watchObservedRunningTime="2025-11-26 12:26:32.622881019 +0000 UTC m=+890.530094371" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.712725 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-595cbdb8c4-fwh2n"] Nov 26 12:26:32 crc kubenswrapper[4834]: E1126 12:26:32.713216 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerName="dnsmasq-dns" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.713240 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerName="dnsmasq-dns" Nov 26 12:26:32 crc kubenswrapper[4834]: E1126 12:26:32.713254 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="53beea6d-dc20-4423-b1eb-ff58ff4c2c69" containerName="keystone-bootstrap" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.713263 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="53beea6d-dc20-4423-b1eb-ff58ff4c2c69" containerName="keystone-bootstrap" Nov 26 12:26:32 crc kubenswrapper[4834]: E1126 12:26:32.713322 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerName="init" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.713329 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerName="init" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.713535 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="53beea6d-dc20-4423-b1eb-ff58ff4c2c69" containerName="keystone-bootstrap" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.713565 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="722cd8b2-0b69-4de7-8d3c-8f794dafb9a5" containerName="dnsmasq-dns" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.714326 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.718741 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.718962 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.719089 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.719216 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.719697 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8frzm" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.719830 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.728590 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-595cbdb8c4-fwh2n"] Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.792431 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-internal-tls-certs\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.792516 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-scripts\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " 
pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.792586 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-fernet-keys\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.792696 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-config-data\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.792732 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-combined-ca-bundle\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.792766 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxjz9\" (UniqueName: \"kubernetes.io/projected/be2722bc-d66d-49d8-8966-769edf761453-kube-api-access-vxjz9\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.792969 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-public-tls-certs\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: 
\"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.793038 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-credential-keys\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.898752 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-credential-keys\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.898915 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-internal-tls-certs\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.898991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-scripts\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.899069 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-fernet-keys\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" 
Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.899158 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-config-data\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.899185 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-combined-ca-bundle\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.899241 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxjz9\" (UniqueName: \"kubernetes.io/projected/be2722bc-d66d-49d8-8966-769edf761453-kube-api-access-vxjz9\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.899442 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-public-tls-certs\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.906172 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-internal-tls-certs\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.907070 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-scripts\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.909166 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-config-data\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.910003 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-public-tls-certs\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.910959 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-combined-ca-bundle\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.913426 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-fernet-keys\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.914473 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/be2722bc-d66d-49d8-8966-769edf761453-credential-keys\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:32 crc kubenswrapper[4834]: I1126 12:26:32.918253 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxjz9\" (UniqueName: \"kubernetes.io/projected/be2722bc-d66d-49d8-8966-769edf761453-kube-api-access-vxjz9\") pod \"keystone-595cbdb8c4-fwh2n\" (UID: \"be2722bc-d66d-49d8-8966-769edf761453\") " pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:33 crc kubenswrapper[4834]: I1126 12:26:33.059524 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:33 crc kubenswrapper[4834]: I1126 12:26:33.532818 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-595cbdb8c4-fwh2n"] Nov 26 12:26:33 crc kubenswrapper[4834]: W1126 12:26:33.536347 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe2722bc_d66d_49d8_8966_769edf761453.slice/crio-54de4592f03ab505f1672d7bd28fac0c3c365e9e024a99197ea84504e293b8fb WatchSource:0}: Error finding container 54de4592f03ab505f1672d7bd28fac0c3c365e9e024a99197ea84504e293b8fb: Status 404 returned error can't find the container with id 54de4592f03ab505f1672d7bd28fac0c3c365e9e024a99197ea84504e293b8fb Nov 26 12:26:33 crc kubenswrapper[4834]: I1126 12:26:33.620637 4834 generic.go:334] "Generic (PLEG): container finished" podID="2eac617a-c5a8-44c6-b790-55ec23e59e5a" containerID="307f633476b01b2a97375072d9c897b59d86ac5e923e230b01a11b8413ad965d" exitCode=0 Nov 26 12:26:33 crc kubenswrapper[4834]: I1126 12:26:33.620669 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6c9ml" 
event={"ID":"2eac617a-c5a8-44c6-b790-55ec23e59e5a","Type":"ContainerDied","Data":"307f633476b01b2a97375072d9c897b59d86ac5e923e230b01a11b8413ad965d"} Nov 26 12:26:33 crc kubenswrapper[4834]: I1126 12:26:33.623117 4834 generic.go:334] "Generic (PLEG): container finished" podID="d96f008c-e967-4142-a433-92edb4634097" containerID="edf55806b92d1dad809694bfadccf8d9051ede3bfb157924b86fd74714adce77" exitCode=0 Nov 26 12:26:33 crc kubenswrapper[4834]: I1126 12:26:33.623179 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rjgqd" event={"ID":"d96f008c-e967-4142-a433-92edb4634097","Type":"ContainerDied","Data":"edf55806b92d1dad809694bfadccf8d9051ede3bfb157924b86fd74714adce77"} Nov 26 12:26:33 crc kubenswrapper[4834]: I1126 12:26:33.624905 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-595cbdb8c4-fwh2n" event={"ID":"be2722bc-d66d-49d8-8966-769edf761453","Type":"ContainerStarted","Data":"54de4592f03ab505f1672d7bd28fac0c3c365e9e024a99197ea84504e293b8fb"} Nov 26 12:26:33 crc kubenswrapper[4834]: I1126 12:26:33.898306 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:34 crc kubenswrapper[4834]: I1126 12:26:34.021613 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:26:34 crc kubenswrapper[4834]: I1126 12:26:34.116177 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f8f5cc67-57p2t"] Nov 26 12:26:34 crc kubenswrapper[4834]: I1126 12:26:34.116402 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" podUID="b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" containerName="dnsmasq-dns" containerID="cri-o://bef68840b906121a64833a59cb43e0eefbec5517babeed0093eb1b83d87e1219" gracePeriod=10 Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.641153 4834 generic.go:334] "Generic (PLEG): 
container finished" podID="b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" containerID="bef68840b906121a64833a59cb43e0eefbec5517babeed0093eb1b83d87e1219" exitCode=0 Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.641334 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" event={"ID":"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5","Type":"ContainerDied","Data":"bef68840b906121a64833a59cb43e0eefbec5517babeed0093eb1b83d87e1219"} Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.656434 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-595cbdb8c4-fwh2n" event={"ID":"be2722bc-d66d-49d8-8966-769edf761453","Type":"ContainerStarted","Data":"4d4184f07ddf99e161b7dff4ab5054fe996831bea9635b8f233aa943f69485b1"} Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.656696 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.677478 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-595cbdb8c4-fwh2n" podStartSLOduration=2.677458095 podStartE2EDuration="2.677458095s" podCreationTimestamp="2025-11-26 12:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:34.675605787 +0000 UTC m=+892.582819139" watchObservedRunningTime="2025-11-26 12:26:34.677458095 +0000 UTC m=+892.584671447" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.775958 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.849394 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-dns-svc\") pod \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.849484 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-nb\") pod \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.849605 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-sb\") pod \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.849659 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-config\") pod \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.849817 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km94j\" (UniqueName: \"kubernetes.io/projected/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-kube-api-access-km94j\") pod \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\" (UID: \"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.864124 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-kube-api-access-km94j" (OuterVolumeSpecName: "kube-api-access-km94j") pod "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" (UID: "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5"). InnerVolumeSpecName "kube-api-access-km94j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.901741 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" (UID: "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.910589 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" (UID: "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.911051 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-config" (OuterVolumeSpecName: "config") pod "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" (UID: "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.939088 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" (UID: "b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.955181 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.955202 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.955336 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km94j\" (UniqueName: \"kubernetes.io/projected/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-kube-api-access-km94j\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.955349 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:34.955367 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.403787 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.470076 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96f008c-e967-4142-a433-92edb4634097-logs\") pod \"d96f008c-e967-4142-a433-92edb4634097\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.470129 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-config-data\") pod \"d96f008c-e967-4142-a433-92edb4634097\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.470216 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njdl4\" (UniqueName: \"kubernetes.io/projected/d96f008c-e967-4142-a433-92edb4634097-kube-api-access-njdl4\") pod \"d96f008c-e967-4142-a433-92edb4634097\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.470235 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-combined-ca-bundle\") pod \"d96f008c-e967-4142-a433-92edb4634097\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.470284 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-scripts\") pod \"d96f008c-e967-4142-a433-92edb4634097\" (UID: \"d96f008c-e967-4142-a433-92edb4634097\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.471932 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d96f008c-e967-4142-a433-92edb4634097-logs" (OuterVolumeSpecName: "logs") pod "d96f008c-e967-4142-a433-92edb4634097" (UID: "d96f008c-e967-4142-a433-92edb4634097"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.474372 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-scripts" (OuterVolumeSpecName: "scripts") pod "d96f008c-e967-4142-a433-92edb4634097" (UID: "d96f008c-e967-4142-a433-92edb4634097"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.487435 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96f008c-e967-4142-a433-92edb4634097-kube-api-access-njdl4" (OuterVolumeSpecName: "kube-api-access-njdl4") pod "d96f008c-e967-4142-a433-92edb4634097" (UID: "d96f008c-e967-4142-a433-92edb4634097"). InnerVolumeSpecName "kube-api-access-njdl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.503804 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-config-data" (OuterVolumeSpecName: "config-data") pod "d96f008c-e967-4142-a433-92edb4634097" (UID: "d96f008c-e967-4142-a433-92edb4634097"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.513227 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d96f008c-e967-4142-a433-92edb4634097" (UID: "d96f008c-e967-4142-a433-92edb4634097"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.517614 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.571877 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-combined-ca-bundle\") pod \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.571956 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-244dd\" (UniqueName: \"kubernetes.io/projected/2eac617a-c5a8-44c6-b790-55ec23e59e5a-kube-api-access-244dd\") pod \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.572080 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-db-sync-config-data\") pod \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\" (UID: \"2eac617a-c5a8-44c6-b790-55ec23e59e5a\") " Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.572569 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njdl4\" (UniqueName: \"kubernetes.io/projected/d96f008c-e967-4142-a433-92edb4634097-kube-api-access-njdl4\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.572590 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.572600 4834 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.572609 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96f008c-e967-4142-a433-92edb4634097-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.572617 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d96f008c-e967-4142-a433-92edb4634097-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.576587 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2eac617a-c5a8-44c6-b790-55ec23e59e5a" (UID: "2eac617a-c5a8-44c6-b790-55ec23e59e5a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.576894 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eac617a-c5a8-44c6-b790-55ec23e59e5a-kube-api-access-244dd" (OuterVolumeSpecName: "kube-api-access-244dd") pod "2eac617a-c5a8-44c6-b790-55ec23e59e5a" (UID: "2eac617a-c5a8-44c6-b790-55ec23e59e5a"). InnerVolumeSpecName "kube-api-access-244dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.592376 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2eac617a-c5a8-44c6-b790-55ec23e59e5a" (UID: "2eac617a-c5a8-44c6-b790-55ec23e59e5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.675107 4834 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.675149 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eac617a-c5a8-44c6-b790-55ec23e59e5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.675161 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-244dd\" (UniqueName: \"kubernetes.io/projected/2eac617a-c5a8-44c6-b790-55ec23e59e5a-kube-api-access-244dd\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.683941 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6c9ml" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.684553 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6c9ml" event={"ID":"2eac617a-c5a8-44c6-b790-55ec23e59e5a","Type":"ContainerDied","Data":"916cb4c1cfdbffa539a6f6f1e102fd18c97b455e097dcd7f37df260b35f36ced"} Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.684589 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="916cb4c1cfdbffa539a6f6f1e102fd18c97b455e097dcd7f37df260b35f36ced" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.690124 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rjgqd" event={"ID":"d96f008c-e967-4142-a433-92edb4634097","Type":"ContainerDied","Data":"bd571e181e8eace8adef53449912ca16599fe771c679286c895383dc4b4e82b5"} Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.690174 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd571e181e8eace8adef53449912ca16599fe771c679286c895383dc4b4e82b5" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.690194 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rjgqd" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.696240 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.696884 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f8f5cc67-57p2t" event={"ID":"b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5","Type":"ContainerDied","Data":"0a4c3befced02872f064a52b951fd1acf9c163e214585503f364a039654c9386"} Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.696938 4834 scope.go:117] "RemoveContainer" containerID="bef68840b906121a64833a59cb43e0eefbec5517babeed0093eb1b83d87e1219" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.787245 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f8f5cc67-57p2t"] Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.817277 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f8f5cc67-57p2t"] Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.827250 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7fc459bb58-nxgpv"] Nov 26 12:26:35 crc kubenswrapper[4834]: E1126 12:26:35.827828 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" containerName="dnsmasq-dns" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.827854 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" containerName="dnsmasq-dns" Nov 26 12:26:35 crc kubenswrapper[4834]: E1126 12:26:35.827862 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eac617a-c5a8-44c6-b790-55ec23e59e5a" containerName="barbican-db-sync" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.827870 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eac617a-c5a8-44c6-b790-55ec23e59e5a" containerName="barbican-db-sync" Nov 26 12:26:35 crc kubenswrapper[4834]: E1126 12:26:35.827883 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d96f008c-e967-4142-a433-92edb4634097" containerName="placement-db-sync" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.827889 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96f008c-e967-4142-a433-92edb4634097" containerName="placement-db-sync" Nov 26 12:26:35 crc kubenswrapper[4834]: E1126 12:26:35.827926 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" containerName="init" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.827933 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" containerName="init" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.828118 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" containerName="dnsmasq-dns" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.828149 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eac617a-c5a8-44c6-b790-55ec23e59e5a" containerName="barbican-db-sync" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.828165 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96f008c-e967-4142-a433-92edb4634097" containerName="placement-db-sync" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.829448 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.833559 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.833776 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.836645 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7fc459bb58-nxgpv"] Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.841251 4834 scope.go:117] "RemoveContainer" containerID="0f18ce8820e9af19d20945fe81a77333049388fc5955b37b76ad04b74d2f5d29" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.846556 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6p9qv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.846849 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7776d59f89-jv7g8"] Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.848516 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.859050 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7776d59f89-jv7g8"] Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.866747 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8685c85bd8-7kkqj"] Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.868369 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.877688 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.877740 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.878062 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.878100 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gxrts" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.878240 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.891760 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-config-data\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.891809 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-scripts\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.891845 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmg5\" (UniqueName: \"kubernetes.io/projected/7889c8e4-2ada-4349-bf82-f177d34a3ad7-kube-api-access-brmg5\") pod 
\"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.891872 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slm8q\" (UniqueName: \"kubernetes.io/projected/6dd7a932-530b-46cd-bfd6-d26679708721-kube-api-access-slm8q\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.891896 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7889c8e4-2ada-4349-bf82-f177d34a3ad7-logs\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.891915 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd7a932-530b-46cd-bfd6-d26679708721-combined-ca-bundle\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.891945 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-public-tls-certs\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.891979 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-sb\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.892004 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-combined-ca-bundle\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.892023 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twgrn\" (UniqueName: \"kubernetes.io/projected/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-kube-api-access-twgrn\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.892049 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd7a932-530b-46cd-bfd6-d26679708721-config-data-custom\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.892076 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-config\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.892103 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd7a932-530b-46cd-bfd6-d26679708721-config-data\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.892122 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-dns-svc\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.892249 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd7a932-530b-46cd-bfd6-d26679708721-logs\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.892267 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-internal-tls-certs\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.892290 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-nb\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc 
kubenswrapper[4834]: I1126 12:26:35.901099 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8685c85bd8-7kkqj"] Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.949377 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7459d949b5-hx42r"] Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.958422 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.965544 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.984383 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7459d949b5-hx42r"] Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993477 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd7a932-530b-46cd-bfd6-d26679708721-logs\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993507 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-internal-tls-certs\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993531 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d471582-dd4e-4b07-b99c-16196de70224-config-data\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " 
pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993550 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdb74\" (UniqueName: \"kubernetes.io/projected/1d471582-dd4e-4b07-b99c-16196de70224-kube-api-access-bdb74\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993573 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-nb\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993589 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-config-data\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993608 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d471582-dd4e-4b07-b99c-16196de70224-combined-ca-bundle\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993624 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-scripts\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " 
pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993647 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brmg5\" (UniqueName: \"kubernetes.io/projected/7889c8e4-2ada-4349-bf82-f177d34a3ad7-kube-api-access-brmg5\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993668 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slm8q\" (UniqueName: \"kubernetes.io/projected/6dd7a932-530b-46cd-bfd6-d26679708721-kube-api-access-slm8q\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993685 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7889c8e4-2ada-4349-bf82-f177d34a3ad7-logs\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993706 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd7a932-530b-46cd-bfd6-d26679708721-combined-ca-bundle\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993731 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-public-tls-certs\") pod \"placement-8685c85bd8-7kkqj\" (UID: 
\"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993767 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-sb\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993792 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-combined-ca-bundle\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993809 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twgrn\" (UniqueName: \"kubernetes.io/projected/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-kube-api-access-twgrn\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993829 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d471582-dd4e-4b07-b99c-16196de70224-logs\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993845 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd7a932-530b-46cd-bfd6-d26679708721-config-data-custom\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: 
\"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993870 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-config\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993894 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd7a932-530b-46cd-bfd6-d26679708721-config-data\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993910 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-dns-svc\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.993926 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d471582-dd4e-4b07-b99c-16196de70224-config-data-custom\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.994225 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd7a932-530b-46cd-bfd6-d26679708721-logs\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: 
\"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:35 crc kubenswrapper[4834]: I1126 12:26:35.995676 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-config\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:35.999750 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-nb\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.000976 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-sb\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.001510 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-dns-svc\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.002252 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7889c8e4-2ada-4349-bf82-f177d34a3ad7-logs\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:36 crc kubenswrapper[4834]: 
I1126 12:26:36.002825 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-internal-tls-certs\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.003012 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd7a932-530b-46cd-bfd6-d26679708721-config-data-custom\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.004601 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-combined-ca-bundle\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.006258 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-public-tls-certs\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.013204 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd7a932-530b-46cd-bfd6-d26679708721-config-data\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.013417 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-scripts\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.015150 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd7a932-530b-46cd-bfd6-d26679708721-combined-ca-bundle\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.022426 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7889c8e4-2ada-4349-bf82-f177d34a3ad7-config-data\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.026570 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmg5\" (UniqueName: \"kubernetes.io/projected/7889c8e4-2ada-4349-bf82-f177d34a3ad7-kube-api-access-brmg5\") pod \"placement-8685c85bd8-7kkqj\" (UID: \"7889c8e4-2ada-4349-bf82-f177d34a3ad7\") " pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.026808 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slm8q\" (UniqueName: \"kubernetes.io/projected/6dd7a932-530b-46cd-bfd6-d26679708721-kube-api-access-slm8q\") pod \"barbican-keystone-listener-7fc459bb58-nxgpv\" (UID: \"6dd7a932-530b-46cd-bfd6-d26679708721\") " pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.029556 4834 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/barbican-api-6d49945c76-8p2wx"] Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.030889 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.032581 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.037324 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twgrn\" (UniqueName: \"kubernetes.io/projected/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-kube-api-access-twgrn\") pod \"dnsmasq-dns-7776d59f89-jv7g8\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.045183 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d49945c76-8p2wx"] Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.095689 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d471582-dd4e-4b07-b99c-16196de70224-logs\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.095811 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data-custom\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.095893 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6pvw\" (UniqueName: 
\"kubernetes.io/projected/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-kube-api-access-q6pvw\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.095968 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d471582-dd4e-4b07-b99c-16196de70224-config-data-custom\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.096036 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d471582-dd4e-4b07-b99c-16196de70224-config-data\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.096093 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-combined-ca-bundle\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.096151 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdb74\" (UniqueName: \"kubernetes.io/projected/1d471582-dd4e-4b07-b99c-16196de70224-kube-api-access-bdb74\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.096217 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d471582-dd4e-4b07-b99c-16196de70224-combined-ca-bundle\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.096278 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-logs\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.096407 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.099222 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d471582-dd4e-4b07-b99c-16196de70224-logs\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.101958 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d471582-dd4e-4b07-b99c-16196de70224-config-data-custom\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.105263 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1d471582-dd4e-4b07-b99c-16196de70224-config-data\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.109448 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d471582-dd4e-4b07-b99c-16196de70224-combined-ca-bundle\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.112725 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdb74\" (UniqueName: \"kubernetes.io/projected/1d471582-dd4e-4b07-b99c-16196de70224-kube-api-access-bdb74\") pod \"barbican-worker-7459d949b5-hx42r\" (UID: \"1d471582-dd4e-4b07-b99c-16196de70224\") " pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.199420 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data-custom\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.199531 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6pvw\" (UniqueName: \"kubernetes.io/projected/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-kube-api-access-q6pvw\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.199621 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-combined-ca-bundle\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.199690 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-logs\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.199774 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.200082 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-logs\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.202763 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data-custom\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.203410 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-combined-ca-bundle\") pod 
\"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.203976 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.217559 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6pvw\" (UniqueName: \"kubernetes.io/projected/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-kube-api-access-q6pvw\") pod \"barbican-api-6d49945c76-8p2wx\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.221534 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.235815 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.241712 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.283032 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7459d949b5-hx42r" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.390993 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.426873 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5" path="/var/lib/kubelet/pods/b7d0c34c-ab0c-4a01-ba4e-5b9e31b0a2d5/volumes" Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.700177 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7fc459bb58-nxgpv"] Nov 26 12:26:36 crc kubenswrapper[4834]: W1126 12:26:36.714619 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dd7a932_530b_46cd_bfd6_d26679708721.slice/crio-da350891b5f78efca6ea091a688db765fa07dfad8aca3250387959bb0f998f33 WatchSource:0}: Error finding container da350891b5f78efca6ea091a688db765fa07dfad8aca3250387959bb0f998f33: Status 404 returned error can't find the container with id da350891b5f78efca6ea091a688db765fa07dfad8aca3250387959bb0f998f33 Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.823972 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7459d949b5-hx42r"] Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.832728 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7776d59f89-jv7g8"] Nov 26 12:26:36 crc kubenswrapper[4834]: I1126 12:26:36.971803 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8685c85bd8-7kkqj"] Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.145783 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d49945c76-8p2wx"] Nov 26 12:26:37 crc kubenswrapper[4834]: W1126 12:26:37.194222 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9bdd80a_b1f9_42a7_8a60_4b514c778d4b.slice/crio-f74a87f1cecc4623678dccd7290ad1495faa6e21b97a730dfb96eead82927acc WatchSource:0}: Error finding container f74a87f1cecc4623678dccd7290ad1495faa6e21b97a730dfb96eead82927acc: Status 404 returned error can't find the container with id f74a87f1cecc4623678dccd7290ad1495faa6e21b97a730dfb96eead82927acc Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.744656 4834 generic.go:334] "Generic (PLEG): container finished" podID="e8c66c5d-0081-4f94-8dd5-14489e07cdfd" containerID="7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f" exitCode=0 Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.744736 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" event={"ID":"e8c66c5d-0081-4f94-8dd5-14489e07cdfd","Type":"ContainerDied","Data":"7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.744771 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" event={"ID":"e8c66c5d-0081-4f94-8dd5-14489e07cdfd","Type":"ContainerStarted","Data":"52e69d4b39894fc868d122406c6b22a54343b5547d5600a7ebbacdd2837dbbe8"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.751400 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" event={"ID":"6dd7a932-530b-46cd-bfd6-d26679708721","Type":"ContainerStarted","Data":"da350891b5f78efca6ea091a688db765fa07dfad8aca3250387959bb0f998f33"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.769193 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8685c85bd8-7kkqj" event={"ID":"7889c8e4-2ada-4349-bf82-f177d34a3ad7","Type":"ContainerStarted","Data":"8215e842149d84d6e03d8d1e4c26b8fc6f885bbc8fdb9461d1364810b5990f86"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 
12:26:37.769243 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8685c85bd8-7kkqj" event={"ID":"7889c8e4-2ada-4349-bf82-f177d34a3ad7","Type":"ContainerStarted","Data":"2b10ed2b3f27bf88950ad8daa09972cb3122e587d5f1504d1584e9d634b9f325"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.769255 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8685c85bd8-7kkqj" event={"ID":"7889c8e4-2ada-4349-bf82-f177d34a3ad7","Type":"ContainerStarted","Data":"6aa5b5836fe3948823840f5bd2a72e8c273a82cbd699532e9331fac3b0e6192d"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.770805 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.770852 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8685c85bd8-7kkqj" Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.816394 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d49945c76-8p2wx" event={"ID":"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b","Type":"ContainerStarted","Data":"3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.816530 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d49945c76-8p2wx" event={"ID":"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b","Type":"ContainerStarted","Data":"1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.816560 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d49945c76-8p2wx" event={"ID":"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b","Type":"ContainerStarted","Data":"f74a87f1cecc4623678dccd7290ad1495faa6e21b97a730dfb96eead82927acc"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.818649 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.818662 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.824532 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8685c85bd8-7kkqj" podStartSLOduration=2.824509729 podStartE2EDuration="2.824509729s" podCreationTimestamp="2025-11-26 12:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:37.792286834 +0000 UTC m=+895.699500186" watchObservedRunningTime="2025-11-26 12:26:37.824509729 +0000 UTC m=+895.731723081" Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.856775 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7459d949b5-hx42r" event={"ID":"1d471582-dd4e-4b07-b99c-16196de70224","Type":"ContainerStarted","Data":"16f09166e5fde7bb4406f68dbcf84605cb7acf05eec2f3c539483c43e4f49574"} Nov 26 12:26:37 crc kubenswrapper[4834]: I1126 12:26:37.886429 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6d49945c76-8p2wx" podStartSLOduration=2.886411207 podStartE2EDuration="2.886411207s" podCreationTimestamp="2025-11-26 12:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:37.856722635 +0000 UTC m=+895.763935987" watchObservedRunningTime="2025-11-26 12:26:37.886411207 +0000 UTC m=+895.793624559" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.097454 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-78b8cf7fb4-c2jnd"] Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.100966 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.105283 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.109738 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.114874 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78b8cf7fb4-c2jnd"] Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.183189 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-internal-tls-certs\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.183565 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-combined-ca-bundle\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.183621 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-public-tls-certs\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.183650 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/92800ecd-26fe-48c0-92fc-46d652fe0480-logs\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.183748 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q2mk\" (UniqueName: \"kubernetes.io/projected/92800ecd-26fe-48c0-92fc-46d652fe0480-kube-api-access-8q2mk\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.184006 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-config-data\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.184074 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-config-data-custom\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.286414 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-config-data\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.286470 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-config-data-custom\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.286592 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-internal-tls-certs\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.286776 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-combined-ca-bundle\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.286831 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-public-tls-certs\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.286861 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92800ecd-26fe-48c0-92fc-46d652fe0480-logs\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.286953 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q2mk\" (UniqueName: 
\"kubernetes.io/projected/92800ecd-26fe-48c0-92fc-46d652fe0480-kube-api-access-8q2mk\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.288429 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92800ecd-26fe-48c0-92fc-46d652fe0480-logs\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.307922 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-combined-ca-bundle\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.308409 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-config-data-custom\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.309766 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-config-data\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.310230 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q2mk\" (UniqueName: \"kubernetes.io/projected/92800ecd-26fe-48c0-92fc-46d652fe0480-kube-api-access-8q2mk\") 
pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.314747 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-internal-tls-certs\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.315156 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92800ecd-26fe-48c0-92fc-46d652fe0480-public-tls-certs\") pod \"barbican-api-78b8cf7fb4-c2jnd\" (UID: \"92800ecd-26fe-48c0-92fc-46d652fe0480\") " pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:39 crc kubenswrapper[4834]: I1126 12:26:39.432232 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:40 crc kubenswrapper[4834]: I1126 12:26:40.353462 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64b956f958-2k5zd" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.135:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.135:8443: connect: connection refused" Nov 26 12:26:40 crc kubenswrapper[4834]: I1126 12:26:40.412387 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5895978f64-t9cvb" podUID="9347056f-b194-406f-8b28-bd86ee220403" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.136:8443: connect: connection refused" Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.418916 4834 scope.go:117] "RemoveContainer" containerID="2c286212e07ffff368fe89eae09573970a75dc61ae582ec636aa7168d83f1c53" Nov 26 12:26:43 crc kubenswrapper[4834]: E1126 12:26:43.691828 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.811142 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78b8cf7fb4-c2jnd"] Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.976894 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7459d949b5-hx42r" event={"ID":"1d471582-dd4e-4b07-b99c-16196de70224","Type":"ContainerStarted","Data":"e75fe9dae96ddc3981d58ce2f00adeeb8de2738fe50851d9ab7b7d3e8e6c36bf"} Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.984926 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-78b8cf7fb4-c2jnd" event={"ID":"92800ecd-26fe-48c0-92fc-46d652fe0480","Type":"ContainerStarted","Data":"affb87e5dd1694233ef5cf327fdbac12a85cd7c5213183ab2de93de4f28a4984"} Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.987081 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" event={"ID":"6dd7a932-530b-46cd-bfd6-d26679708721","Type":"ContainerStarted","Data":"4c613b595eca57a6ccff2c24e2c04b355a62e28b29d584a16da99b18d422bf2f"} Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.991080 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfde693e-68da-45d5-803a-c8a0e36a198a","Type":"ContainerStarted","Data":"f09126d337e4fd4bf79ca32776e7c4a20525e37f98c4a8bb1cade6f555b4fd09"} Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.991215 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="ceilometer-notification-agent" containerID="cri-o://3cb9146b384da2fc97aabd6065964b95908546eb74a18b646ed19772282e5130" gracePeriod=30 Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.991254 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.991331 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="proxy-httpd" containerID="cri-o://f09126d337e4fd4bf79ca32776e7c4a20525e37f98c4a8bb1cade6f555b4fd09" gracePeriod=30 Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.991382 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="sg-core" 
containerID="cri-o://3e31d5168665dd1bbdc361ae4db6fe75589cb8a019033458c739cdca0e68ea85" gracePeriod=30 Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.995167 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/2.log" Nov 26 12:26:43 crc kubenswrapper[4834]: I1126 12:26:43.997078 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/1.log" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.001587 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d947cb56d-gkmf6" event={"ID":"35f03545-c45a-4288-8877-d3bb326d6ab8","Type":"ContainerStarted","Data":"1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39"} Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.002325 4834 scope.go:117] "RemoveContainer" containerID="1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39" Nov 26 12:26:44 crc kubenswrapper[4834]: E1126 12:26:44.002552 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-7d947cb56d-gkmf6_openstack(35f03545-c45a-4288-8877-d3bb326d6ab8)\"" pod="openstack/neutron-7d947cb56d-gkmf6" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.008034 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" event={"ID":"e8c66c5d-0081-4f94-8dd5-14489e07cdfd","Type":"ContainerStarted","Data":"af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8"} Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.008466 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 
12:26:44.058927 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" podStartSLOduration=9.05890534 podStartE2EDuration="9.05890534s" podCreationTimestamp="2025-11-26 12:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:44.049189763 +0000 UTC m=+901.956403114" watchObservedRunningTime="2025-11-26 12:26:44.05890534 +0000 UTC m=+901.966118692" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.217236 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wlvcf"] Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.218886 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.241370 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlvcf"] Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.339111 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-catalog-content\") pod \"redhat-marketplace-wlvcf\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.339424 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vlj\" (UniqueName: \"kubernetes.io/projected/19845428-2c8a-43df-98f9-def4fc4e7a84-kube-api-access-m5vlj\") pod \"redhat-marketplace-wlvcf\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.339476 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-utilities\") pod \"redhat-marketplace-wlvcf\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.440649 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-utilities\") pod \"redhat-marketplace-wlvcf\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.440770 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-catalog-content\") pod \"redhat-marketplace-wlvcf\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.440848 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vlj\" (UniqueName: \"kubernetes.io/projected/19845428-2c8a-43df-98f9-def4fc4e7a84-kube-api-access-m5vlj\") pod \"redhat-marketplace-wlvcf\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.441634 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-catalog-content\") pod \"redhat-marketplace-wlvcf\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.441823 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-utilities\") pod \"redhat-marketplace-wlvcf\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.457928 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vlj\" (UniqueName: \"kubernetes.io/projected/19845428-2c8a-43df-98f9-def4fc4e7a84-kube-api-access-m5vlj\") pod \"redhat-marketplace-wlvcf\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.547598 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:44 crc kubenswrapper[4834]: I1126 12:26:44.975336 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlvcf"] Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.029687 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kj94c" event={"ID":"0ddc5896-100d-473a-9bed-a2e13560bc8e","Type":"ContainerStarted","Data":"e961004ab1d8e0705e4cd7dc27c910f3006255fa1cdaddd53953741cc52444d3"} Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.041513 4834 generic.go:334] "Generic (PLEG): container finished" podID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerID="f09126d337e4fd4bf79ca32776e7c4a20525e37f98c4a8bb1cade6f555b4fd09" exitCode=0 Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.041547 4834 generic.go:334] "Generic (PLEG): container finished" podID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerID="3e31d5168665dd1bbdc361ae4db6fe75589cb8a019033458c739cdca0e68ea85" exitCode=2 Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.041597 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dfde693e-68da-45d5-803a-c8a0e36a198a","Type":"ContainerDied","Data":"f09126d337e4fd4bf79ca32776e7c4a20525e37f98c4a8bb1cade6f555b4fd09"} Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.041658 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfde693e-68da-45d5-803a-c8a0e36a198a","Type":"ContainerDied","Data":"3e31d5168665dd1bbdc361ae4db6fe75589cb8a019033458c739cdca0e68ea85"} Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.044945 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7459d949b5-hx42r" event={"ID":"1d471582-dd4e-4b07-b99c-16196de70224","Type":"ContainerStarted","Data":"a4b92eecb4ca315d4c70a6b13c5ce6eb894846b4420bc9a25f9a448e01e0bee6"} Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.047274 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/2.log" Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.047790 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/1.log" Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.049561 4834 generic.go:334] "Generic (PLEG): container finished" podID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerID="1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39" exitCode=1 Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.049605 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d947cb56d-gkmf6" event={"ID":"35f03545-c45a-4288-8877-d3bb326d6ab8","Type":"ContainerDied","Data":"1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39"} Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.049643 4834 scope.go:117] "RemoveContainer" containerID="2c286212e07ffff368fe89eae09573970a75dc61ae582ec636aa7168d83f1c53" Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 
12:26:45.050480 4834 scope.go:117] "RemoveContainer" containerID="1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39" Nov 26 12:26:45 crc kubenswrapper[4834]: E1126 12:26:45.050781 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-7d947cb56d-gkmf6_openstack(35f03545-c45a-4288-8877-d3bb326d6ab8)\"" pod="openstack/neutron-7d947cb56d-gkmf6" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.051720 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlvcf" event={"ID":"19845428-2c8a-43df-98f9-def4fc4e7a84","Type":"ContainerStarted","Data":"8fe1db87f766ab422e47facb47036042eca5666e73d9ce79b29358c265906384"} Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.053797 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78b8cf7fb4-c2jnd" event={"ID":"92800ecd-26fe-48c0-92fc-46d652fe0480","Type":"ContainerStarted","Data":"e60c92d18ad7b060ee5db9ea3ee5f6ca425c51e155d4a58f057f1e3e5f802837"} Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.053825 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78b8cf7fb4-c2jnd" event={"ID":"92800ecd-26fe-48c0-92fc-46d652fe0480","Type":"ContainerStarted","Data":"d2cad61cb8e1ae5748c87f3ed3e73361872d9d58a03ed19d84bd4d1ee821acf7"} Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.053927 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.053957 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.063190 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" event={"ID":"6dd7a932-530b-46cd-bfd6-d26679708721","Type":"ContainerStarted","Data":"b2708a87bc9d66728a5ba21e3a3ecc91d4b2813b2380263963fbdf3ceeb8590f"} Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.063836 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-kj94c" podStartSLOduration=3.182240882 podStartE2EDuration="43.063823093s" podCreationTimestamp="2025-11-26 12:26:02 +0000 UTC" firstStartedPulling="2025-11-26 12:26:03.50302223 +0000 UTC m=+861.410235582" lastFinishedPulling="2025-11-26 12:26:43.384604442 +0000 UTC m=+901.291817793" observedRunningTime="2025-11-26 12:26:45.055082421 +0000 UTC m=+902.962295773" watchObservedRunningTime="2025-11-26 12:26:45.063823093 +0000 UTC m=+902.971036445" Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.072583 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-78b8cf7fb4-c2jnd" podStartSLOduration=6.072570397 podStartE2EDuration="6.072570397s" podCreationTimestamp="2025-11-26 12:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:45.070961996 +0000 UTC m=+902.978175349" watchObservedRunningTime="2025-11-26 12:26:45.072570397 +0000 UTC m=+902.979783748" Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.090057 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7459d949b5-hx42r" podStartSLOduration=3.611606372 podStartE2EDuration="10.090042721s" podCreationTimestamp="2025-11-26 12:26:35 +0000 UTC" firstStartedPulling="2025-11-26 12:26:36.850903244 +0000 UTC m=+894.758116595" lastFinishedPulling="2025-11-26 12:26:43.329339592 +0000 UTC m=+901.236552944" observedRunningTime="2025-11-26 12:26:45.088027967 +0000 UTC m=+902.995241309" watchObservedRunningTime="2025-11-26 12:26:45.090042721 +0000 
UTC m=+902.997256074" Nov 26 12:26:45 crc kubenswrapper[4834]: I1126 12:26:45.128169 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7fc459bb58-nxgpv" podStartSLOduration=3.5271362120000003 podStartE2EDuration="10.128155488s" podCreationTimestamp="2025-11-26 12:26:35 +0000 UTC" firstStartedPulling="2025-11-26 12:26:36.728282394 +0000 UTC m=+894.635495746" lastFinishedPulling="2025-11-26 12:26:43.32930167 +0000 UTC m=+901.236515022" observedRunningTime="2025-11-26 12:26:45.119721134 +0000 UTC m=+903.026934486" watchObservedRunningTime="2025-11-26 12:26:45.128155488 +0000 UTC m=+903.035368841" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.076238 4834 generic.go:334] "Generic (PLEG): container finished" podID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerID="3cb9146b384da2fc97aabd6065964b95908546eb74a18b646ed19772282e5130" exitCode=0 Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.076342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfde693e-68da-45d5-803a-c8a0e36a198a","Type":"ContainerDied","Data":"3cb9146b384da2fc97aabd6065964b95908546eb74a18b646ed19772282e5130"} Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.076597 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfde693e-68da-45d5-803a-c8a0e36a198a","Type":"ContainerDied","Data":"ccb4280319de17ed85411472b25e27102b342c359fa2bedf9781b5c0f67ac1c7"} Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.076619 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccb4280319de17ed85411472b25e27102b342c359fa2bedf9781b5c0f67ac1c7" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.078348 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/2.log" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 
12:26:46.080580 4834 generic.go:334] "Generic (PLEG): container finished" podID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerID="4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353" exitCode=0 Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.080628 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlvcf" event={"ID":"19845428-2c8a-43df-98f9-def4fc4e7a84","Type":"ContainerDied","Data":"4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353"} Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.127778 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.173582 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhrzb\" (UniqueName: \"kubernetes.io/projected/dfde693e-68da-45d5-803a-c8a0e36a198a-kube-api-access-xhrzb\") pod \"dfde693e-68da-45d5-803a-c8a0e36a198a\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.173644 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-log-httpd\") pod \"dfde693e-68da-45d5-803a-c8a0e36a198a\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.173701 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-scripts\") pod \"dfde693e-68da-45d5-803a-c8a0e36a198a\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.173784 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-run-httpd\") pod \"dfde693e-68da-45d5-803a-c8a0e36a198a\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.173814 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-config-data\") pod \"dfde693e-68da-45d5-803a-c8a0e36a198a\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.173886 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-sg-core-conf-yaml\") pod \"dfde693e-68da-45d5-803a-c8a0e36a198a\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.174064 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-combined-ca-bundle\") pod \"dfde693e-68da-45d5-803a-c8a0e36a198a\" (UID: \"dfde693e-68da-45d5-803a-c8a0e36a198a\") " Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.174200 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dfde693e-68da-45d5-803a-c8a0e36a198a" (UID: "dfde693e-68da-45d5-803a-c8a0e36a198a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.174324 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dfde693e-68da-45d5-803a-c8a0e36a198a" (UID: "dfde693e-68da-45d5-803a-c8a0e36a198a"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.176219 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.176251 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfde693e-68da-45d5-803a-c8a0e36a198a-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.180943 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfde693e-68da-45d5-803a-c8a0e36a198a-kube-api-access-xhrzb" (OuterVolumeSpecName: "kube-api-access-xhrzb") pod "dfde693e-68da-45d5-803a-c8a0e36a198a" (UID: "dfde693e-68da-45d5-803a-c8a0e36a198a"). InnerVolumeSpecName "kube-api-access-xhrzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.182487 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-scripts" (OuterVolumeSpecName: "scripts") pod "dfde693e-68da-45d5-803a-c8a0e36a198a" (UID: "dfde693e-68da-45d5-803a-c8a0e36a198a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.198734 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dfde693e-68da-45d5-803a-c8a0e36a198a" (UID: "dfde693e-68da-45d5-803a-c8a0e36a198a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.222223 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfde693e-68da-45d5-803a-c8a0e36a198a" (UID: "dfde693e-68da-45d5-803a-c8a0e36a198a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.239551 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-config-data" (OuterVolumeSpecName: "config-data") pod "dfde693e-68da-45d5-803a-c8a0e36a198a" (UID: "dfde693e-68da-45d5-803a-c8a0e36a198a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.278367 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.278618 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhrzb\" (UniqueName: \"kubernetes.io/projected/dfde693e-68da-45d5-803a-c8a0e36a198a-kube-api-access-xhrzb\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.278630 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.278639 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 
12:26:46 crc kubenswrapper[4834]: I1126 12:26:46.278652 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfde693e-68da-45d5-803a-c8a0e36a198a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.093148 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ddc5896-100d-473a-9bed-a2e13560bc8e" containerID="e961004ab1d8e0705e4cd7dc27c910f3006255fa1cdaddd53953741cc52444d3" exitCode=0 Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.093354 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kj94c" event={"ID":"0ddc5896-100d-473a-9bed-a2e13560bc8e","Type":"ContainerDied","Data":"e961004ab1d8e0705e4cd7dc27c910f3006255fa1cdaddd53953741cc52444d3"} Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.096175 4834 generic.go:334] "Generic (PLEG): container finished" podID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerID="a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465" exitCode=0 Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.096291 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.097639 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlvcf" event={"ID":"19845428-2c8a-43df-98f9-def4fc4e7a84","Type":"ContainerDied","Data":"a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465"} Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.169633 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.181907 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.201056 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:26:47 crc kubenswrapper[4834]: E1126 12:26:47.201638 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="sg-core" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.201713 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="sg-core" Nov 26 12:26:47 crc kubenswrapper[4834]: E1126 12:26:47.201769 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="ceilometer-notification-agent" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.201817 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="ceilometer-notification-agent" Nov 26 12:26:47 crc kubenswrapper[4834]: E1126 12:26:47.201887 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="proxy-httpd" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.201938 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" 
containerName="proxy-httpd" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.202158 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="proxy-httpd" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.202219 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="sg-core" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.202289 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" containerName="ceilometer-notification-agent" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.203876 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.205532 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.208394 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.209543 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.308669 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-config-data\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.308706 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wzsf\" (UniqueName: \"kubernetes.io/projected/f32f5f3a-b8f8-459a-891b-91aad92910c2-kube-api-access-6wzsf\") pod \"ceilometer-0\" (UID: 
\"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.308741 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-scripts\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.309055 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.309167 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-log-httpd\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.309203 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-run-httpd\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.309224 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.410794 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-scripts\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.410875 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.410907 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-log-httpd\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.410932 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-run-httpd\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.410956 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.411005 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-config-data\") pod \"ceilometer-0\" (UID: 
\"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.411026 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wzsf\" (UniqueName: \"kubernetes.io/projected/f32f5f3a-b8f8-459a-891b-91aad92910c2-kube-api-access-6wzsf\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.412742 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-run-httpd\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.415626 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-log-httpd\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.416989 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-scripts\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.417986 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.418811 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.419795 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-config-data\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.427515 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wzsf\" (UniqueName: \"kubernetes.io/projected/f32f5f3a-b8f8-459a-891b-91aad92910c2-kube-api-access-6wzsf\") pod \"ceilometer-0\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.518787 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.634342 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.689088 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:47 crc kubenswrapper[4834]: I1126 12:26:47.937757 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.107421 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerStarted","Data":"0d049cf1fa9bb8e4c2fa346ae75b72579c28aa324d1f9568a7df46818435e31f"} Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.109932 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-wlvcf" event={"ID":"19845428-2c8a-43df-98f9-def4fc4e7a84","Type":"ContainerStarted","Data":"135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea"} Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.388826 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.406626 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wlvcf" podStartSLOduration=2.851520319 podStartE2EDuration="4.406610933s" podCreationTimestamp="2025-11-26 12:26:44 +0000 UTC" firstStartedPulling="2025-11-26 12:26:46.082174554 +0000 UTC m=+903.989387906" lastFinishedPulling="2025-11-26 12:26:47.637265168 +0000 UTC m=+905.544478520" observedRunningTime="2025-11-26 12:26:48.12718259 +0000 UTC m=+906.034395943" watchObservedRunningTime="2025-11-26 12:26:48.406610933 +0000 UTC m=+906.313824275" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.434102 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfde693e-68da-45d5-803a-c8a0e36a198a" path="/var/lib/kubelet/pods/dfde693e-68da-45d5-803a-c8a0e36a198a/volumes" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.436003 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn8sn\" (UniqueName: \"kubernetes.io/projected/0ddc5896-100d-473a-9bed-a2e13560bc8e-kube-api-access-gn8sn\") pod \"0ddc5896-100d-473a-9bed-a2e13560bc8e\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.436066 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-scripts\") pod \"0ddc5896-100d-473a-9bed-a2e13560bc8e\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " Nov 26 12:26:48 crc 
kubenswrapper[4834]: I1126 12:26:48.436178 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-combined-ca-bundle\") pod \"0ddc5896-100d-473a-9bed-a2e13560bc8e\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.436327 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-db-sync-config-data\") pod \"0ddc5896-100d-473a-9bed-a2e13560bc8e\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.436446 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ddc5896-100d-473a-9bed-a2e13560bc8e-etc-machine-id\") pod \"0ddc5896-100d-473a-9bed-a2e13560bc8e\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.436503 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-config-data\") pod \"0ddc5896-100d-473a-9bed-a2e13560bc8e\" (UID: \"0ddc5896-100d-473a-9bed-a2e13560bc8e\") " Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.436556 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ddc5896-100d-473a-9bed-a2e13560bc8e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0ddc5896-100d-473a-9bed-a2e13560bc8e" (UID: "0ddc5896-100d-473a-9bed-a2e13560bc8e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.437172 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ddc5896-100d-473a-9bed-a2e13560bc8e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.444513 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-scripts" (OuterVolumeSpecName: "scripts") pod "0ddc5896-100d-473a-9bed-a2e13560bc8e" (UID: "0ddc5896-100d-473a-9bed-a2e13560bc8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.444642 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ddc5896-100d-473a-9bed-a2e13560bc8e-kube-api-access-gn8sn" (OuterVolumeSpecName: "kube-api-access-gn8sn") pod "0ddc5896-100d-473a-9bed-a2e13560bc8e" (UID: "0ddc5896-100d-473a-9bed-a2e13560bc8e"). InnerVolumeSpecName "kube-api-access-gn8sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.445560 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0ddc5896-100d-473a-9bed-a2e13560bc8e" (UID: "0ddc5896-100d-473a-9bed-a2e13560bc8e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.478446 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ddc5896-100d-473a-9bed-a2e13560bc8e" (UID: "0ddc5896-100d-473a-9bed-a2e13560bc8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.489299 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-config-data" (OuterVolumeSpecName: "config-data") pod "0ddc5896-100d-473a-9bed-a2e13560bc8e" (UID: "0ddc5896-100d-473a-9bed-a2e13560bc8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.538683 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.538714 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn8sn\" (UniqueName: \"kubernetes.io/projected/0ddc5896-100d-473a-9bed-a2e13560bc8e-kube-api-access-gn8sn\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.538728 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.538738 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 
12:26:48 crc kubenswrapper[4834]: I1126 12:26:48.538748 4834 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ddc5896-100d-473a-9bed-a2e13560bc8e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.119663 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerStarted","Data":"9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5"} Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.121811 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kj94c" event={"ID":"0ddc5896-100d-473a-9bed-a2e13560bc8e","Type":"ContainerDied","Data":"1a536e397ec042d75db0e603a6432f8790589fbffd70e293d7157f7ad85826d2"} Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.121860 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a536e397ec042d75db0e603a6432f8790589fbffd70e293d7157f7ad85826d2" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.121884 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-kj94c" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.292468 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.293384 4834 scope.go:117] "RemoveContainer" containerID="1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39" Nov 26 12:26:49 crc kubenswrapper[4834]: E1126 12:26:49.293569 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-7d947cb56d-gkmf6_openstack(35f03545-c45a-4288-8877-d3bb326d6ab8)\"" pod="openstack/neutron-7d947cb56d-gkmf6" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.294124 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.297949 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-7d947cb56d-gkmf6" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-api" probeResult="failure" output="Get \"http://10.217.0.139:9696/\": dial tcp 10.217.0.139:9696: connect: connection refused" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.372089 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 12:26:49 crc kubenswrapper[4834]: E1126 12:26:49.372460 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ddc5896-100d-473a-9bed-a2e13560bc8e" containerName="cinder-db-sync" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.372479 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddc5896-100d-473a-9bed-a2e13560bc8e" containerName="cinder-db-sync" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.372629 4834 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0ddc5896-100d-473a-9bed-a2e13560bc8e" containerName="cinder-db-sync" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.373444 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.394332 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.399830 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.400016 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x8jns" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.400130 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.400651 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.450907 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7776d59f89-jv7g8"] Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.451075 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" podUID="e8c66c5d-0081-4f94-8dd5-14489e07cdfd" containerName="dnsmasq-dns" containerID="cri-o://af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8" gracePeriod=10 Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.452411 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.456678 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bc89f58d7-tg9dw"] 
Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.457916 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.470347 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.470398 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cfrl\" (UniqueName: \"kubernetes.io/projected/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-kube-api-access-7cfrl\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.470428 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.470450 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.470467 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.470533 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.483994 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc89f58d7-tg9dw"] Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572034 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-dns-svc\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572091 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-config\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572112 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572134 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572161 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsggf\" (UniqueName: \"kubernetes.io/projected/2281f6ae-aa77-445f-9f3e-4c15ce93debf-kube-api-access-jsggf\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572183 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572204 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cfrl\" (UniqueName: \"kubernetes.io/projected/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-kube-api-access-7cfrl\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572229 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572252 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572279 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.572342 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.573052 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.577995 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.580206 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.581989 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.583513 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.608921 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cfrl\" (UniqueName: \"kubernetes.io/projected/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-kube-api-access-7cfrl\") pod \"cinder-scheduler-0\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.636046 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.641752 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.641911 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.644358 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.674023 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-dns-svc\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.674086 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-config\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.674104 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.674135 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsggf\" (UniqueName: \"kubernetes.io/projected/2281f6ae-aa77-445f-9f3e-4c15ce93debf-kube-api-access-jsggf\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.674157 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.675090 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.675285 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-dns-svc\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.675485 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.678076 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-config\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.690472 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsggf\" (UniqueName: 
\"kubernetes.io/projected/2281f6ae-aa77-445f-9f3e-4c15ce93debf-kube-api-access-jsggf\") pod \"dnsmasq-dns-7bc89f58d7-tg9dw\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.751385 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.776116 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.776263 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92aa52d4-c3ae-49e3-b766-2b5330df94e1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.776462 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.776573 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqtfz\" (UniqueName: \"kubernetes.io/projected/92aa52d4-c3ae-49e3-b766-2b5330df94e1-kube-api-access-hqtfz\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.776689 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-scripts\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.776818 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92aa52d4-c3ae-49e3-b766-2b5330df94e1-logs\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.776928 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data-custom\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.836897 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.878287 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.878461 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92aa52d4-c3ae-49e3-b766-2b5330df94e1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.878573 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.878650 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqtfz\" (UniqueName: \"kubernetes.io/projected/92aa52d4-c3ae-49e3-b766-2b5330df94e1-kube-api-access-hqtfz\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.878722 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-scripts\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.878791 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/92aa52d4-c3ae-49e3-b766-2b5330df94e1-logs\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.878857 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data-custom\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.879571 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92aa52d4-c3ae-49e3-b766-2b5330df94e1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.879981 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92aa52d4-c3ae-49e3-b766-2b5330df94e1-logs\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.891429 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-scripts\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.891473 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.891657 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.892460 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data-custom\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.901406 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqtfz\" (UniqueName: \"kubernetes.io/projected/92aa52d4-c3ae-49e3-b766-2b5330df94e1-kube-api-access-hqtfz\") pod \"cinder-api-0\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " pod="openstack/cinder-api-0" Nov 26 12:26:49 crc kubenswrapper[4834]: I1126 12:26:49.963446 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.067476 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.139861 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerStarted","Data":"b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a"} Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.142491 4834 generic.go:334] "Generic (PLEG): container finished" podID="e8c66c5d-0081-4f94-8dd5-14489e07cdfd" containerID="af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8" exitCode=0 Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.142580 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" event={"ID":"e8c66c5d-0081-4f94-8dd5-14489e07cdfd","Type":"ContainerDied","Data":"af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8"} Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.142625 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" event={"ID":"e8c66c5d-0081-4f94-8dd5-14489e07cdfd","Type":"ContainerDied","Data":"52e69d4b39894fc868d122406c6b22a54343b5547d5600a7ebbacdd2837dbbe8"} Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.142644 4834 scope.go:117] "RemoveContainer" containerID="af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.142693 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7776d59f89-jv7g8" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.143337 4834 scope.go:117] "RemoveContainer" containerID="1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39" Nov 26 12:26:50 crc kubenswrapper[4834]: E1126 12:26:50.143704 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-7d947cb56d-gkmf6_openstack(35f03545-c45a-4288-8877-d3bb326d6ab8)\"" pod="openstack/neutron-7d947cb56d-gkmf6" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.169347 4834 scope.go:117] "RemoveContainer" containerID="7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.184865 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-config\") pod \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.184908 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-nb\") pod \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.184948 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-dns-svc\") pod \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.185108 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-sb\") pod \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.185288 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twgrn\" (UniqueName: \"kubernetes.io/projected/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-kube-api-access-twgrn\") pod \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\" (UID: \"e8c66c5d-0081-4f94-8dd5-14489e07cdfd\") " Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.196081 4834 scope.go:117] "RemoveContainer" containerID="af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8" Nov 26 12:26:50 crc kubenswrapper[4834]: E1126 12:26:50.199452 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8\": container with ID starting with af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8 not found: ID does not exist" containerID="af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.199495 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8"} err="failed to get container status \"af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8\": rpc error: code = NotFound desc = could not find container \"af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8\": container with ID starting with af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8 not found: ID does not exist" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.199525 4834 scope.go:117] "RemoveContainer" 
containerID="7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.203493 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-kube-api-access-twgrn" (OuterVolumeSpecName: "kube-api-access-twgrn") pod "e8c66c5d-0081-4f94-8dd5-14489e07cdfd" (UID: "e8c66c5d-0081-4f94-8dd5-14489e07cdfd"). InnerVolumeSpecName "kube-api-access-twgrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:50 crc kubenswrapper[4834]: E1126 12:26:50.203580 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f\": container with ID starting with 7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f not found: ID does not exist" containerID="7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.203606 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f"} err="failed to get container status \"7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f\": rpc error: code = NotFound desc = could not find container \"7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f\": container with ID starting with 7b050604b1a7821f3990c89668a8ccba8d25e8088b4eefa58540cc6002dccf3f not found: ID does not exist" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.229505 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-config" (OuterVolumeSpecName: "config") pod "e8c66c5d-0081-4f94-8dd5-14489e07cdfd" (UID: "e8c66c5d-0081-4f94-8dd5-14489e07cdfd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.237823 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8c66c5d-0081-4f94-8dd5-14489e07cdfd" (UID: "e8c66c5d-0081-4f94-8dd5-14489e07cdfd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.241690 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8c66c5d-0081-4f94-8dd5-14489e07cdfd" (UID: "e8c66c5d-0081-4f94-8dd5-14489e07cdfd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.249649 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8c66c5d-0081-4f94-8dd5-14489e07cdfd" (UID: "e8c66c5d-0081-4f94-8dd5-14489e07cdfd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.287830 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.287880 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twgrn\" (UniqueName: \"kubernetes.io/projected/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-kube-api-access-twgrn\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.287892 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.287901 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.287911 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c66c5d-0081-4f94-8dd5-14489e07cdfd-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.300707 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.413751 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc89f58d7-tg9dw"] Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.468413 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7776d59f89-jv7g8"] Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.474803 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7776d59f89-jv7g8"] Nov 26 12:26:50 crc kubenswrapper[4834]: I1126 12:26:50.552355 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.069826 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.162198 4834 generic.go:334] "Generic (PLEG): container finished" podID="2281f6ae-aa77-445f-9f3e-4c15ce93debf" containerID="06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a" exitCode=0 Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.162261 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" event={"ID":"2281f6ae-aa77-445f-9f3e-4c15ce93debf","Type":"ContainerDied","Data":"06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a"} Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.162335 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" event={"ID":"2281f6ae-aa77-445f-9f3e-4c15ce93debf","Type":"ContainerStarted","Data":"9cd308f73bdb257d6ae523b01c8f600b335ea81473b5aa9f39a3e5379a2390da"} Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.165198 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerStarted","Data":"ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2"} Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.166703 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"92aa52d4-c3ae-49e3-b766-2b5330df94e1","Type":"ContainerStarted","Data":"5e2c412673054a09fb0cde8c455621fc81c4c80989e0df8364a5070d4bf82b01"} Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.169475 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"75be0f1c-a91a-428a-a8a0-7cab480d8c2a","Type":"ContainerStarted","Data":"53c6a089ea0298398d5d054c62b91f6d870a3a212684885e61344938ac4e603b"} Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.169782 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78b8cf7fb4-c2jnd" Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.252654 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bcb769647-fxvwp" Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.265046 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d49945c76-8p2wx"] Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.265292 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6d49945c76-8p2wx" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api-log" containerID="cri-o://1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5" gracePeriod=30 Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.265605 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6d49945c76-8p2wx" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api" containerID="cri-o://3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416" gracePeriod=30 Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.314177 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d947cb56d-gkmf6"] Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.314601 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7d947cb56d-gkmf6" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-api" containerID="cri-o://bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8" gracePeriod=30 Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 
12:26:51.515016 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.531366 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:26:51 crc kubenswrapper[4834]: I1126 12:26:51.531401 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.181139 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" event={"ID":"2281f6ae-aa77-445f-9f3e-4c15ce93debf","Type":"ContainerStarted","Data":"6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4"} Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.181423 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.185169 4834 generic.go:334] "Generic (PLEG): container finished" podID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerID="1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5" exitCode=143 Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.185228 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d49945c76-8p2wx" event={"ID":"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b","Type":"ContainerDied","Data":"1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5"} Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.188951 4834 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"92aa52d4-c3ae-49e3-b766-2b5330df94e1","Type":"ContainerStarted","Data":"7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5"} Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.214175 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" podStartSLOduration=3.214156913 podStartE2EDuration="3.214156913s" podCreationTimestamp="2025-11-26 12:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:52.20695027 +0000 UTC m=+910.114163623" watchObservedRunningTime="2025-11-26 12:26:52.214156913 +0000 UTC m=+910.121370255" Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.443237 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c66c5d-0081-4f94-8dd5-14489e07cdfd" path="/var/lib/kubelet/pods/e8c66c5d-0081-4f94-8dd5-14489e07cdfd/volumes" Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.451261 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.612748 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.929366 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/2.log" Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.929952 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.976865 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-httpd-config\") pod \"35f03545-c45a-4288-8877-d3bb326d6ab8\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.976934 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssdlp\" (UniqueName: \"kubernetes.io/projected/35f03545-c45a-4288-8877-d3bb326d6ab8-kube-api-access-ssdlp\") pod \"35f03545-c45a-4288-8877-d3bb326d6ab8\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.976963 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-ovndb-tls-certs\") pod \"35f03545-c45a-4288-8877-d3bb326d6ab8\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.977025 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-config\") pod \"35f03545-c45a-4288-8877-d3bb326d6ab8\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.977077 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-combined-ca-bundle\") pod \"35f03545-c45a-4288-8877-d3bb326d6ab8\" (UID: \"35f03545-c45a-4288-8877-d3bb326d6ab8\") " Nov 26 12:26:52 crc kubenswrapper[4834]: I1126 12:26:52.997456 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/35f03545-c45a-4288-8877-d3bb326d6ab8-kube-api-access-ssdlp" (OuterVolumeSpecName: "kube-api-access-ssdlp") pod "35f03545-c45a-4288-8877-d3bb326d6ab8" (UID: "35f03545-c45a-4288-8877-d3bb326d6ab8"). InnerVolumeSpecName "kube-api-access-ssdlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.020024 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "35f03545-c45a-4288-8877-d3bb326d6ab8" (UID: "35f03545-c45a-4288-8877-d3bb326d6ab8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.082076 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.082104 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssdlp\" (UniqueName: \"kubernetes.io/projected/35f03545-c45a-4288-8877-d3bb326d6ab8-kube-api-access-ssdlp\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.082758 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35f03545-c45a-4288-8877-d3bb326d6ab8" (UID: "35f03545-c45a-4288-8877-d3bb326d6ab8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.087921 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-config" (OuterVolumeSpecName: "config") pod "35f03545-c45a-4288-8877-d3bb326d6ab8" (UID: "35f03545-c45a-4288-8877-d3bb326d6ab8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.120445 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "35f03545-c45a-4288-8877-d3bb326d6ab8" (UID: "35f03545-c45a-4288-8877-d3bb326d6ab8"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.184254 4834 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.184292 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.184303 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f03545-c45a-4288-8877-d3bb326d6ab8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.200850 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"75be0f1c-a91a-428a-a8a0-7cab480d8c2a","Type":"ContainerStarted","Data":"90c5dbe11347bc7e18488697192007690a99c7ecc9f00c441a2f7ffb56c79440"} 
Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.200899 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"75be0f1c-a91a-428a-a8a0-7cab480d8c2a","Type":"ContainerStarted","Data":"0af6fbc528da14750d78fc5ae6a22c310a7be054dbb6d32f6e4fc4b8fa853e8e"} Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.203779 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerStarted","Data":"4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479"} Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.203913 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.205728 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"92aa52d4-c3ae-49e3-b766-2b5330df94e1","Type":"ContainerStarted","Data":"b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77"} Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.205810 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerName="cinder-api-log" containerID="cri-o://7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5" gracePeriod=30 Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.205852 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerName="cinder-api" containerID="cri-o://b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77" gracePeriod=30 Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.205875 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.207896 4834 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_neutron-7d947cb56d-gkmf6_35f03545-c45a-4288-8877-d3bb326d6ab8/neutron-httpd/2.log" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.208220 4834 generic.go:334] "Generic (PLEG): container finished" podID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerID="bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8" exitCode=0 Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.209024 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d947cb56d-gkmf6" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.209219 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d947cb56d-gkmf6" event={"ID":"35f03545-c45a-4288-8877-d3bb326d6ab8","Type":"ContainerDied","Data":"bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8"} Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.209279 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d947cb56d-gkmf6" event={"ID":"35f03545-c45a-4288-8877-d3bb326d6ab8","Type":"ContainerDied","Data":"44ff63c5dbfd5a524a153afc3989e0c34391df7235181d2c7431f3c0056d4fb3"} Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.209304 4834 scope.go:117] "RemoveContainer" containerID="1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.222818 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.211576523 podStartE2EDuration="4.22279909s" podCreationTimestamp="2025-11-26 12:26:49 +0000 UTC" firstStartedPulling="2025-11-26 12:26:50.303995803 +0000 UTC m=+908.211209154" lastFinishedPulling="2025-11-26 12:26:51.315218369 +0000 UTC m=+909.222431721" observedRunningTime="2025-11-26 12:26:53.217693725 +0000 UTC m=+911.124907077" watchObservedRunningTime="2025-11-26 12:26:53.22279909 +0000 UTC m=+911.130012442" Nov 26 
12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.237371 4834 scope.go:117] "RemoveContainer" containerID="bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.244662 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.221519244 podStartE2EDuration="6.244643088s" podCreationTimestamp="2025-11-26 12:26:47 +0000 UTC" firstStartedPulling="2025-11-26 12:26:47.943722332 +0000 UTC m=+905.850935685" lastFinishedPulling="2025-11-26 12:26:51.966846178 +0000 UTC m=+909.874059529" observedRunningTime="2025-11-26 12:26:53.238173785 +0000 UTC m=+911.145387137" watchObservedRunningTime="2025-11-26 12:26:53.244643088 +0000 UTC m=+911.151856440" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.271189 4834 scope.go:117] "RemoveContainer" containerID="1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.271354 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.271333334 podStartE2EDuration="4.271333334s" podCreationTimestamp="2025-11-26 12:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:53.251142469 +0000 UTC m=+911.158355820" watchObservedRunningTime="2025-11-26 12:26:53.271333334 +0000 UTC m=+911.178546686" Nov 26 12:26:53 crc kubenswrapper[4834]: E1126 12:26:53.273902 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39\": container with ID starting with 1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39 not found: ID does not exist" containerID="1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39" Nov 26 12:26:53 crc 
kubenswrapper[4834]: I1126 12:26:53.273938 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39"} err="failed to get container status \"1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39\": rpc error: code = NotFound desc = could not find container \"1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39\": container with ID starting with 1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39 not found: ID does not exist" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.273960 4834 scope.go:117] "RemoveContainer" containerID="bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8" Nov 26 12:26:53 crc kubenswrapper[4834]: E1126 12:26:53.274232 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8\": container with ID starting with bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8 not found: ID does not exist" containerID="bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.274277 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8"} err="failed to get container status \"bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8\": rpc error: code = NotFound desc = could not find container \"bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8\": container with ID starting with bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8 not found: ID does not exist" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.281876 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d947cb56d-gkmf6"] Nov 26 12:26:53 crc 
kubenswrapper[4834]: I1126 12:26:53.289509 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7d947cb56d-gkmf6"] Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.729923 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.794602 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data-custom\") pod \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.794650 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-scripts\") pod \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.794711 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqtfz\" (UniqueName: \"kubernetes.io/projected/92aa52d4-c3ae-49e3-b766-2b5330df94e1-kube-api-access-hqtfz\") pod \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.795433 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92aa52d4-c3ae-49e3-b766-2b5330df94e1-etc-machine-id\") pod \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.795485 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-combined-ca-bundle\") pod 
\"92aa52d4-c3ae-49e3-b766-2b5330df94e1\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.795556 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data\") pod \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.795567 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92aa52d4-c3ae-49e3-b766-2b5330df94e1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "92aa52d4-c3ae-49e3-b766-2b5330df94e1" (UID: "92aa52d4-c3ae-49e3-b766-2b5330df94e1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.795624 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92aa52d4-c3ae-49e3-b766-2b5330df94e1-logs\") pod \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\" (UID: \"92aa52d4-c3ae-49e3-b766-2b5330df94e1\") " Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.796066 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92aa52d4-c3ae-49e3-b766-2b5330df94e1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.796916 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92aa52d4-c3ae-49e3-b766-2b5330df94e1-logs" (OuterVolumeSpecName: "logs") pod "92aa52d4-c3ae-49e3-b766-2b5330df94e1" (UID: "92aa52d4-c3ae-49e3-b766-2b5330df94e1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.800430 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-scripts" (OuterVolumeSpecName: "scripts") pod "92aa52d4-c3ae-49e3-b766-2b5330df94e1" (UID: "92aa52d4-c3ae-49e3-b766-2b5330df94e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.801683 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "92aa52d4-c3ae-49e3-b766-2b5330df94e1" (UID: "92aa52d4-c3ae-49e3-b766-2b5330df94e1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.812796 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92aa52d4-c3ae-49e3-b766-2b5330df94e1-kube-api-access-hqtfz" (OuterVolumeSpecName: "kube-api-access-hqtfz") pod "92aa52d4-c3ae-49e3-b766-2b5330df94e1" (UID: "92aa52d4-c3ae-49e3-b766-2b5330df94e1"). InnerVolumeSpecName "kube-api-access-hqtfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.823434 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92aa52d4-c3ae-49e3-b766-2b5330df94e1" (UID: "92aa52d4-c3ae-49e3-b766-2b5330df94e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.839542 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data" (OuterVolumeSpecName: "config-data") pod "92aa52d4-c3ae-49e3-b766-2b5330df94e1" (UID: "92aa52d4-c3ae-49e3-b766-2b5330df94e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.898327 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.898362 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.898372 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92aa52d4-c3ae-49e3-b766-2b5330df94e1-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.898383 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.898392 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92aa52d4-c3ae-49e3-b766-2b5330df94e1-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:53 crc kubenswrapper[4834]: I1126 12:26:53.898399 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqtfz\" (UniqueName: 
\"kubernetes.io/projected/92aa52d4-c3ae-49e3-b766-2b5330df94e1-kube-api-access-hqtfz\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.134285 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.185848 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5895978f64-t9cvb" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.230753 4834 generic.go:334] "Generic (PLEG): container finished" podID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerID="b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77" exitCode=0 Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.230782 4834 generic.go:334] "Generic (PLEG): container finished" podID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerID="7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5" exitCode=143 Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.230788 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"92aa52d4-c3ae-49e3-b766-2b5330df94e1","Type":"ContainerDied","Data":"b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77"} Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.230825 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"92aa52d4-c3ae-49e3-b766-2b5330df94e1","Type":"ContainerDied","Data":"7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5"} Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.230829 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.230848 4834 scope.go:117] "RemoveContainer" containerID="b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.230837 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"92aa52d4-c3ae-49e3-b766-2b5330df94e1","Type":"ContainerDied","Data":"5e2c412673054a09fb0cde8c455621fc81c4c80989e0df8364a5070d4bf82b01"} Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.234655 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64b956f958-2k5zd"] Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.237130 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64b956f958-2k5zd" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon-log" containerID="cri-o://b1d2249bb255822f9a4b38553c1dc1bfbdc8ce15098e178e88a006df9ed337af" gracePeriod=30 Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.237768 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64b956f958-2k5zd" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon" containerID="cri-o://9e4fcaac3592dd2cfcbe81a7173beea0340ff06e304762077d856046e28d7416" gracePeriod=30 Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.259989 4834 scope.go:117] "RemoveContainer" containerID="7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.268248 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.276618 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.285733 4834 scope.go:117] "RemoveContainer" 
containerID="b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77" Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.288745 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77\": container with ID starting with b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77 not found: ID does not exist" containerID="b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.288784 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77"} err="failed to get container status \"b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77\": rpc error: code = NotFound desc = could not find container \"b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77\": container with ID starting with b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77 not found: ID does not exist" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.288821 4834 scope.go:117] "RemoveContainer" containerID="7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5" Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.289102 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5\": container with ID starting with 7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5 not found: ID does not exist" containerID="7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.289117 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5"} err="failed to get container status \"7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5\": rpc error: code = NotFound desc = could not find container \"7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5\": container with ID starting with 7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5 not found: ID does not exist" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.289135 4834 scope.go:117] "RemoveContainer" containerID="b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.290091 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77"} err="failed to get container status \"b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77\": rpc error: code = NotFound desc = could not find container \"b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77\": container with ID starting with b396bec9b1dba56ceebe4a75969b747ed9c3078d4a18087e0715941d3027fe77 not found: ID does not exist" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.290146 4834 scope.go:117] "RemoveContainer" containerID="7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.290942 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5"} err="failed to get container status \"7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5\": rpc error: code = NotFound desc = could not find container \"7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5\": container with ID starting with 7d156fb17b8870c903f3d21fe9a4b2713a02004cc5b272446fa8258cc8999ca5 not found: ID does not 
exist" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.291518 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.291909 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-api" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.291935 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-api" Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.291947 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-httpd" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.291954 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-httpd" Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.291963 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerName="cinder-api" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.291971 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerName="cinder-api" Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.291983 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-httpd" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.291988 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-httpd" Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.292002 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-httpd" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.292009 4834 
state_mem.go:107] "Deleted CPUSet assignment" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-httpd" Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.292040 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c66c5d-0081-4f94-8dd5-14489e07cdfd" containerName="dnsmasq-dns" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.292046 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c66c5d-0081-4f94-8dd5-14489e07cdfd" containerName="dnsmasq-dns" Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.292057 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerName="cinder-api-log" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.292064 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerName="cinder-api-log" Nov 26 12:26:54 crc kubenswrapper[4834]: E1126 12:26:54.292072 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c66c5d-0081-4f94-8dd5-14489e07cdfd" containerName="init" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.292078 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c66c5d-0081-4f94-8dd5-14489e07cdfd" containerName="init" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.292248 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerName="cinder-api" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.292279 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-api" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.292288 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c66c5d-0081-4f94-8dd5-14489e07cdfd" containerName="dnsmasq-dns" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.292298 4834 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-httpd" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.294394 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-httpd" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.294442 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" containerName="cinder-api-log" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.294459 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" containerName="neutron-httpd" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.295741 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.299193 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.299474 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.299706 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.302024 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.412657 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-config-data\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.412731 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.412776 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.412803 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.412844 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vch28\" (UniqueName: \"kubernetes.io/projected/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-kube-api-access-vch28\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.412880 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.412915 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.412939 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-logs\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.412956 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-scripts\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.449889 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f03545-c45a-4288-8877-d3bb326d6ab8" path="/var/lib/kubelet/pods/35f03545-c45a-4288-8877-d3bb326d6ab8/volumes" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.450662 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92aa52d4-c3ae-49e3-b766-2b5330df94e1" path="/var/lib/kubelet/pods/92aa52d4-c3ae-49e3-b766-2b5330df94e1/volumes" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.514600 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-logs\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.514646 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-scripts\") pod 
\"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.514681 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-config-data\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.514740 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.514797 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.514833 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.514899 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vch28\" (UniqueName: \"kubernetes.io/projected/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-kube-api-access-vch28\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.514960 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.515032 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.515162 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.515158 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-logs\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.519601 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.519744 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " 
pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.523667 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-scripts\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.523945 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.524112 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-config-data\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.524679 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.529941 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vch28\" (UniqueName: \"kubernetes.io/projected/a5845ee5-aed4-4450-a04c-2e25bd2dc0f2-kube-api-access-vch28\") pod \"cinder-api-0\" (UID: \"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2\") " pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.548289 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:54 crc 
kubenswrapper[4834]: I1126 12:26:54.548352 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.585766 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.624356 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.667848 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d49945c76-8p2wx" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:54832->10.217.0.146:9311: read: connection reset by peer" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.667925 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d49945c76-8p2wx" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:54828->10.217.0.146:9311: read: connection reset by peer" Nov 26 12:26:54 crc kubenswrapper[4834]: I1126 12:26:54.752412 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.037534 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.134143 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.227697 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-combined-ca-bundle\") pod \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.227981 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data\") pod \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.228027 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-logs\") pod \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.228349 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data-custom\") pod \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.228426 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6pvw\" (UniqueName: \"kubernetes.io/projected/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-kube-api-access-q6pvw\") pod \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\" (UID: \"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b\") " Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.228828 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-logs" (OuterVolumeSpecName: "logs") pod "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" (UID: "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.232392 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" (UID: "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.244122 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-kube-api-access-q6pvw" (OuterVolumeSpecName: "kube-api-access-q6pvw") pod "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" (UID: "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b"). InnerVolumeSpecName "kube-api-access-q6pvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.260319 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2","Type":"ContainerStarted","Data":"a75ab3e0f610a78c501bbd426f28e450605e8ea319bb15d813c42de2001c9144"} Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.265509 4834 generic.go:334] "Generic (PLEG): container finished" podID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerID="3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416" exitCode=0 Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.265566 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d49945c76-8p2wx" event={"ID":"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b","Type":"ContainerDied","Data":"3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416"} Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.265595 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d49945c76-8p2wx" event={"ID":"d9bdd80a-b1f9-42a7-8a60-4b514c778d4b","Type":"ContainerDied","Data":"f74a87f1cecc4623678dccd7290ad1495faa6e21b97a730dfb96eead82927acc"} Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.265621 4834 scope.go:117] "RemoveContainer" containerID="3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.265775 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d49945c76-8p2wx" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.271536 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" (UID: "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.288781 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data" (OuterVolumeSpecName: "config-data") pod "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" (UID: "d9bdd80a-b1f9-42a7-8a60-4b514c778d4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.315447 4834 scope.go:117] "RemoveContainer" containerID="1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.331171 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.331197 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.331206 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.331217 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6pvw\" (UniqueName: \"kubernetes.io/projected/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-kube-api-access-q6pvw\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.331227 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.346525 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.351497 4834 scope.go:117] "RemoveContainer" containerID="3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416" Nov 26 12:26:55 crc kubenswrapper[4834]: E1126 12:26:55.352474 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416\": container with ID starting with 3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416 not found: ID does not exist" containerID="3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.352513 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416"} err="failed to get container status \"3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416\": rpc error: code = NotFound desc = could not find container \"3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416\": container with ID starting with 3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416 not found: ID does not exist" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.352540 4834 scope.go:117] "RemoveContainer" containerID="1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5" Nov 26 12:26:55 crc kubenswrapper[4834]: E1126 12:26:55.352886 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5\": container with ID 
starting with 1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5 not found: ID does not exist" containerID="1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.352950 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5"} err="failed to get container status \"1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5\": rpc error: code = NotFound desc = could not find container \"1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5\": container with ID starting with 1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5 not found: ID does not exist" Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.391321 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlvcf"] Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.597756 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d49945c76-8p2wx"] Nov 26 12:26:55 crc kubenswrapper[4834]: I1126 12:26:55.606638 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6d49945c76-8p2wx"] Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.287273 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2","Type":"ContainerStarted","Data":"3f4197987614c4d27c627a7af6fb1fd0160279c99eb53e815b626acf0569358c"} Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.287605 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5845ee5-aed4-4450-a04c-2e25bd2dc0f2","Type":"ContainerStarted","Data":"5f431ecb875eeaf76a7f39e90539a123b5e71de354fb90bede4de4ba0422ffc2"} Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.287627 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/cinder-api-0" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.310686 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.310662246 podStartE2EDuration="2.310662246s" podCreationTimestamp="2025-11-26 12:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:26:56.304598094 +0000 UTC m=+914.211811446" watchObservedRunningTime="2025-11-26 12:26:56.310662246 +0000 UTC m=+914.217875597" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.429521 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" path="/var/lib/kubelet/pods/d9bdd80a-b1f9-42a7-8a60-4b514c778d4b/volumes" Nov 26 12:26:56 crc kubenswrapper[4834]: W1126 12:26:56.469719 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8c66c5d_0081_4f94_8dd5_14489e07cdfd.slice/crio-af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8.scope WatchSource:0}: Error finding container af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8: Status 404 returned error can't find the container with id af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8 Nov 26 12:26:56 crc kubenswrapper[4834]: W1126 12:26:56.478713 4834 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ddc5896_100d_473a_9bed_a2e13560bc8e.slice/crio-conmon-e961004ab1d8e0705e4cd7dc27c910f3006255fa1cdaddd53953741cc52444d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ddc5896_100d_473a_9bed_a2e13560bc8e.slice/crio-conmon-e961004ab1d8e0705e4cd7dc27c910f3006255fa1cdaddd53953741cc52444d3.scope: no such file 
or directory Nov 26 12:26:56 crc kubenswrapper[4834]: W1126 12:26:56.478782 4834 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ddc5896_100d_473a_9bed_a2e13560bc8e.slice/crio-e961004ab1d8e0705e4cd7dc27c910f3006255fa1cdaddd53953741cc52444d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ddc5896_100d_473a_9bed_a2e13560bc8e.slice/crio-e961004ab1d8e0705e4cd7dc27c910f3006255fa1cdaddd53953741cc52444d3.scope: no such file or directory Nov 26 12:26:56 crc kubenswrapper[4834]: W1126 12:26:56.479118 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f03545_c45a_4288_8877_d3bb326d6ab8.slice/crio-1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39.scope WatchSource:0}: Error finding container 1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39: Status 404 returned error can't find the container with id 1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39 Nov 26 12:26:56 crc kubenswrapper[4834]: W1126 12:26:56.542111 4834 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92aa52d4_c3ae_49e3_b766_2b5330df94e1.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92aa52d4_c3ae_49e3_b766_2b5330df94e1.slice: no such file or directory Nov 26 12:26:56 crc kubenswrapper[4834]: E1126 12:26:56.702663 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f03545_c45a_4288_8877_d3bb326d6ab8.slice/crio-44ff63c5dbfd5a524a153afc3989e0c34391df7235181d2c7431f3c0056d4fb3\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfde693e_68da_45d5_803a_c8a0e36a198a.slice/crio-conmon-f09126d337e4fd4bf79ca32776e7c4a20525e37f98c4a8bb1cade6f555b4fd09.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f03545_c45a_4288_8877_d3bb326d6ab8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f03545_c45a_4288_8877_d3bb326d6ab8.slice/crio-conmon-1bd0daa14677dca29a5735cf02c16303275e8a1fa562cbb2b1718094a32aff39.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23fd02e3_eaab_4e25_b9f7_74bee38551e7.slice/crio-conmon-428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ddc5896_100d_473a_9bed_a2e13560bc8e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9bdd80a_b1f9_42a7_8a60_4b514c778d4b.slice/crio-1f35e0a19e16d354d0d307af91eb9ff4a9822d1d603edd82dfb4e7ada4325ca5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23fd02e3_eaab_4e25_b9f7_74bee38551e7.slice/crio-428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfde693e_68da_45d5_803a_c8a0e36a198a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d21dfc7_f906_45e6_99f3_bac942be8ed9.slice/crio-efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8c66c5d_0081_4f94_8dd5_14489e07cdfd.slice/crio-52e69d4b39894fc868d122406c6b22a54343b5547d5600a7ebbacdd2837dbbe8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8c66c5d_0081_4f94_8dd5_14489e07cdfd.slice/crio-conmon-af0748440baecaa648ebd533a3049e48c9a6f1a9f67e9e388cc8feb47bf674c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfde693e_68da_45d5_803a_c8a0e36a198a.slice/crio-conmon-3cb9146b384da2fc97aabd6065964b95908546eb74a18b646ed19772282e5130.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ddc5896_100d_473a_9bed_a2e13560bc8e.slice/crio-1a536e397ec042d75db0e603a6432f8790589fbffd70e293d7157f7ad85826d2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfde693e_68da_45d5_803a_c8a0e36a198a.slice/crio-ccb4280319de17ed85411472b25e27102b342c359fa2bedf9781b5c0f67ac1c7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8c66c5d_0081_4f94_8dd5_14489e07cdfd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d21dfc7_f906_45e6_99f3_bac942be8ed9.slice/crio-conmon-efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9bdd80a_b1f9_42a7_8a60_4b514c778d4b.slice/crio-f74a87f1cecc4623678dccd7290ad1495faa6e21b97a730dfb96eead82927acc\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfde693e_68da_45d5_803a_c8a0e36a198a.slice/crio-conmon-3e31d5168665dd1bbdc361ae4db6fe75589cb8a019033458c739cdca0e68ea85.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f03545_c45a_4288_8877_d3bb326d6ab8.slice/crio-bc3a0d29d6b23882965039455f0b4cc54004c8d48edf4070bf446e01ba2c7da8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9bdd80a_b1f9_42a7_8a60_4b514c778d4b.slice/crio-conmon-3302fc64d606d801c8ab964a09669063d18348509884ccbfa03b0c9f173ef416.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d21dfc7_f906_45e6_99f3_bac942be8ed9.slice/crio-350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfde693e_68da_45d5_803a_c8a0e36a198a.slice/crio-3e31d5168665dd1bbdc361ae4db6fe75589cb8a019033458c739cdca0e68ea85.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d21dfc7_f906_45e6_99f3_bac942be8ed9.slice/crio-conmon-350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16.scope\": RecentStats: unable to find data in memory cache]" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.841699 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.858106 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-scripts\") pod \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.858508 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-config-data\") pod \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.858601 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fd02e3-eaab-4e25-b9f7-74bee38551e7-logs\") pod \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.858740 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nts6\" (UniqueName: \"kubernetes.io/projected/23fd02e3-eaab-4e25-b9f7-74bee38551e7-kube-api-access-2nts6\") pod \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.859135 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23fd02e3-eaab-4e25-b9f7-74bee38551e7-horizon-secret-key\") pod \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\" (UID: \"23fd02e3-eaab-4e25-b9f7-74bee38551e7\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.860793 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/23fd02e3-eaab-4e25-b9f7-74bee38551e7-logs" (OuterVolumeSpecName: "logs") pod "23fd02e3-eaab-4e25-b9f7-74bee38551e7" (UID: "23fd02e3-eaab-4e25-b9f7-74bee38551e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.861910 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fd02e3-eaab-4e25-b9f7-74bee38551e7-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.865990 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fd02e3-eaab-4e25-b9f7-74bee38551e7-kube-api-access-2nts6" (OuterVolumeSpecName: "kube-api-access-2nts6") pod "23fd02e3-eaab-4e25-b9f7-74bee38551e7" (UID: "23fd02e3-eaab-4e25-b9f7-74bee38551e7"). InnerVolumeSpecName "kube-api-access-2nts6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.868982 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fd02e3-eaab-4e25-b9f7-74bee38551e7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "23fd02e3-eaab-4e25-b9f7-74bee38551e7" (UID: "23fd02e3-eaab-4e25-b9f7-74bee38551e7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.880639 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-config-data" (OuterVolumeSpecName: "config-data") pod "23fd02e3-eaab-4e25-b9f7-74bee38551e7" (UID: "23fd02e3-eaab-4e25-b9f7-74bee38551e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.882526 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-scripts" (OuterVolumeSpecName: "scripts") pod "23fd02e3-eaab-4e25-b9f7-74bee38551e7" (UID: "23fd02e3-eaab-4e25-b9f7-74bee38551e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.935905 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.963371 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-config-data\") pod \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.963532 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mctt\" (UniqueName: \"kubernetes.io/projected/8d21dfc7-f906-45e6-99f3-bac942be8ed9-kube-api-access-5mctt\") pod \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.963638 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-scripts\") pod \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.963779 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d21dfc7-f906-45e6-99f3-bac942be8ed9-logs\") pod 
\"8d21dfc7-f906-45e6-99f3-bac942be8ed9\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.964106 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d21dfc7-f906-45e6-99f3-bac942be8ed9-horizon-secret-key\") pod \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\" (UID: \"8d21dfc7-f906-45e6-99f3-bac942be8ed9\") " Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.964251 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d21dfc7-f906-45e6-99f3-bac942be8ed9-logs" (OuterVolumeSpecName: "logs") pod "8d21dfc7-f906-45e6-99f3-bac942be8ed9" (UID: "8d21dfc7-f906-45e6-99f3-bac942be8ed9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.965090 4834 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23fd02e3-eaab-4e25-b9f7-74bee38551e7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.965115 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d21dfc7-f906-45e6-99f3-bac942be8ed9-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.965130 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.965143 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23fd02e3-eaab-4e25-b9f7-74bee38551e7-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.965156 4834 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-2nts6\" (UniqueName: \"kubernetes.io/projected/23fd02e3-eaab-4e25-b9f7-74bee38551e7-kube-api-access-2nts6\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.969100 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d21dfc7-f906-45e6-99f3-bac942be8ed9-kube-api-access-5mctt" (OuterVolumeSpecName: "kube-api-access-5mctt") pod "8d21dfc7-f906-45e6-99f3-bac942be8ed9" (UID: "8d21dfc7-f906-45e6-99f3-bac942be8ed9"). InnerVolumeSpecName "kube-api-access-5mctt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.970236 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d21dfc7-f906-45e6-99f3-bac942be8ed9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8d21dfc7-f906-45e6-99f3-bac942be8ed9" (UID: "8d21dfc7-f906-45e6-99f3-bac942be8ed9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.983695 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-scripts" (OuterVolumeSpecName: "scripts") pod "8d21dfc7-f906-45e6-99f3-bac942be8ed9" (UID: "8d21dfc7-f906-45e6-99f3-bac942be8ed9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:56 crc kubenswrapper[4834]: I1126 12:26:56.985973 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-config-data" (OuterVolumeSpecName: "config-data") pod "8d21dfc7-f906-45e6-99f3-bac942be8ed9" (UID: "8d21dfc7-f906-45e6-99f3-bac942be8ed9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.067868 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.067900 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mctt\" (UniqueName: \"kubernetes.io/projected/8d21dfc7-f906-45e6-99f3-bac942be8ed9-kube-api-access-5mctt\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.067914 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d21dfc7-f906-45e6-99f3-bac942be8ed9-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.067923 4834 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d21dfc7-f906-45e6-99f3-bac942be8ed9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.299766 4834 generic.go:334] "Generic (PLEG): container finished" podID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerID="428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36" exitCode=137 Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.300491 4834 generic.go:334] "Generic (PLEG): container finished" podID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerID="c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87" exitCode=137 Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.299831 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b64d5bd45-4kjkh" event={"ID":"23fd02e3-eaab-4e25-b9f7-74bee38551e7","Type":"ContainerDied","Data":"428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36"} Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 
12:26:57.300592 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b64d5bd45-4kjkh" event={"ID":"23fd02e3-eaab-4e25-b9f7-74bee38551e7","Type":"ContainerDied","Data":"c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87"} Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.299885 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b64d5bd45-4kjkh" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.300628 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b64d5bd45-4kjkh" event={"ID":"23fd02e3-eaab-4e25-b9f7-74bee38551e7","Type":"ContainerDied","Data":"d10f3a858c8af537e7a547adc7d536435f17a543ea71254c55f13f3b334d845c"} Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.300654 4834 scope.go:117] "RemoveContainer" containerID="428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.303547 4834 generic.go:334] "Generic (PLEG): container finished" podID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerID="efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1" exitCode=137 Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.303584 4834 generic.go:334] "Generic (PLEG): container finished" podID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerID="350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16" exitCode=137 Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.303615 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86d6455c5c-px2dk" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.303594 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86d6455c5c-px2dk" event={"ID":"8d21dfc7-f906-45e6-99f3-bac942be8ed9","Type":"ContainerDied","Data":"efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1"} Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.303778 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86d6455c5c-px2dk" event={"ID":"8d21dfc7-f906-45e6-99f3-bac942be8ed9","Type":"ContainerDied","Data":"350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16"} Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.303801 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86d6455c5c-px2dk" event={"ID":"8d21dfc7-f906-45e6-99f3-bac942be8ed9","Type":"ContainerDied","Data":"f7b60804d3afcbafda6dc7d27f983da354074942aa94c41490d7e7bdb380725b"} Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.304203 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wlvcf" podUID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerName="registry-server" containerID="cri-o://135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea" gracePeriod=2 Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.335871 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b64d5bd45-4kjkh"] Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.343706 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b64d5bd45-4kjkh"] Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.348428 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86d6455c5c-px2dk"] Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.352817 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86d6455c5c-px2dk"] 
Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.483519 4834 scope.go:117] "RemoveContainer" containerID="c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.504055 4834 scope.go:117] "RemoveContainer" containerID="428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36" Nov 26 12:26:57 crc kubenswrapper[4834]: E1126 12:26:57.504464 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36\": container with ID starting with 428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36 not found: ID does not exist" containerID="428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.504504 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36"} err="failed to get container status \"428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36\": rpc error: code = NotFound desc = could not find container \"428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36\": container with ID starting with 428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36 not found: ID does not exist" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.504528 4834 scope.go:117] "RemoveContainer" containerID="c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87" Nov 26 12:26:57 crc kubenswrapper[4834]: E1126 12:26:57.504860 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87\": container with ID starting with c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87 not found: ID does not exist" 
containerID="c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.504932 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87"} err="failed to get container status \"c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87\": rpc error: code = NotFound desc = could not find container \"c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87\": container with ID starting with c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87 not found: ID does not exist" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.504986 4834 scope.go:117] "RemoveContainer" containerID="428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.505407 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36"} err="failed to get container status \"428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36\": rpc error: code = NotFound desc = could not find container \"428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36\": container with ID starting with 428d351bb10024c2ecaeb2a24e57725a763c87f00f3203e7ab8d10344af9ea36 not found: ID does not exist" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.505434 4834 scope.go:117] "RemoveContainer" containerID="c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.505677 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87"} err="failed to get container status \"c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87\": rpc error: code = NotFound desc = could 
not find container \"c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87\": container with ID starting with c64b3691b596f990bb221dc4563e67aeabda598a6836043f67132ea7431f2f87 not found: ID does not exist" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.505712 4834 scope.go:117] "RemoveContainer" containerID="efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.656463 4834 scope.go:117] "RemoveContainer" containerID="350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.721009 4834 scope.go:117] "RemoveContainer" containerID="efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1" Nov 26 12:26:57 crc kubenswrapper[4834]: E1126 12:26:57.721584 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1\": container with ID starting with efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1 not found: ID does not exist" containerID="efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.721629 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1"} err="failed to get container status \"efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1\": rpc error: code = NotFound desc = could not find container \"efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1\": container with ID starting with efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1 not found: ID does not exist" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.721658 4834 scope.go:117] "RemoveContainer" containerID="350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16" Nov 26 
12:26:57 crc kubenswrapper[4834]: E1126 12:26:57.722136 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16\": container with ID starting with 350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16 not found: ID does not exist" containerID="350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.722166 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16"} err="failed to get container status \"350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16\": rpc error: code = NotFound desc = could not find container \"350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16\": container with ID starting with 350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16 not found: ID does not exist" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.722183 4834 scope.go:117] "RemoveContainer" containerID="efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.722546 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1"} err="failed to get container status \"efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1\": rpc error: code = NotFound desc = could not find container \"efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1\": container with ID starting with efe43037acd9c5d2bfd7a68519bbff8aaf5991ad284caaf79d2f6366ffddc5e1 not found: ID does not exist" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.722588 4834 scope.go:117] "RemoveContainer" 
containerID="350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.722808 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16"} err="failed to get container status \"350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16\": rpc error: code = NotFound desc = could not find container \"350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16\": container with ID starting with 350d46c5bfd854aaf191757e5faa2fc9132316ee658679520b79b6398374ca16 not found: ID does not exist" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.756190 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.781814 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5vlj\" (UniqueName: \"kubernetes.io/projected/19845428-2c8a-43df-98f9-def4fc4e7a84-kube-api-access-m5vlj\") pod \"19845428-2c8a-43df-98f9-def4fc4e7a84\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.781899 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-utilities\") pod \"19845428-2c8a-43df-98f9-def4fc4e7a84\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.782043 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-catalog-content\") pod \"19845428-2c8a-43df-98f9-def4fc4e7a84\" (UID: \"19845428-2c8a-43df-98f9-def4fc4e7a84\") " Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 
12:26:57.787426 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19845428-2c8a-43df-98f9-def4fc4e7a84-kube-api-access-m5vlj" (OuterVolumeSpecName: "kube-api-access-m5vlj") pod "19845428-2c8a-43df-98f9-def4fc4e7a84" (UID: "19845428-2c8a-43df-98f9-def4fc4e7a84"). InnerVolumeSpecName "kube-api-access-m5vlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.787569 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-utilities" (OuterVolumeSpecName: "utilities") pod "19845428-2c8a-43df-98f9-def4fc4e7a84" (UID: "19845428-2c8a-43df-98f9-def4fc4e7a84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.796022 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19845428-2c8a-43df-98f9-def4fc4e7a84" (UID: "19845428-2c8a-43df-98f9-def4fc4e7a84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.886034 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5vlj\" (UniqueName: \"kubernetes.io/projected/19845428-2c8a-43df-98f9-def4fc4e7a84-kube-api-access-m5vlj\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.886068 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:57 crc kubenswrapper[4834]: I1126 12:26:57.886081 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19845428-2c8a-43df-98f9-def4fc4e7a84-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.314613 4834 generic.go:334] "Generic (PLEG): container finished" podID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerID="135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea" exitCode=0 Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.314698 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlvcf" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.314699 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlvcf" event={"ID":"19845428-2c8a-43df-98f9-def4fc4e7a84","Type":"ContainerDied","Data":"135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea"} Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.314841 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlvcf" event={"ID":"19845428-2c8a-43df-98f9-def4fc4e7a84","Type":"ContainerDied","Data":"8fe1db87f766ab422e47facb47036042eca5666e73d9ce79b29358c265906384"} Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.314876 4834 scope.go:117] "RemoveContainer" containerID="135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.320465 4834 generic.go:334] "Generic (PLEG): container finished" podID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerID="9e4fcaac3592dd2cfcbe81a7173beea0340ff06e304762077d856046e28d7416" exitCode=0 Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.320518 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64b956f958-2k5zd" event={"ID":"4de7fa58-f514-49a8-88b0-205ef138c8a3","Type":"ContainerDied","Data":"9e4fcaac3592dd2cfcbe81a7173beea0340ff06e304762077d856046e28d7416"} Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.342469 4834 scope.go:117] "RemoveContainer" containerID="a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.345891 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlvcf"] Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.353406 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlvcf"] Nov 26 12:26:58 
crc kubenswrapper[4834]: I1126 12:26:58.361307 4834 scope.go:117] "RemoveContainer" containerID="4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.393223 4834 scope.go:117] "RemoveContainer" containerID="135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea" Nov 26 12:26:58 crc kubenswrapper[4834]: E1126 12:26:58.394698 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea\": container with ID starting with 135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea not found: ID does not exist" containerID="135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.394769 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea"} err="failed to get container status \"135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea\": rpc error: code = NotFound desc = could not find container \"135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea\": container with ID starting with 135d655f8935666000b3494c46fb3d7af0393916830c3d4e70533669dcd5f4ea not found: ID does not exist" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.394806 4834 scope.go:117] "RemoveContainer" containerID="a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465" Nov 26 12:26:58 crc kubenswrapper[4834]: E1126 12:26:58.395198 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465\": container with ID starting with a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465 not found: ID does not exist" 
containerID="a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.395249 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465"} err="failed to get container status \"a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465\": rpc error: code = NotFound desc = could not find container \"a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465\": container with ID starting with a753b0edaa51b41ea2df97d919e5ccd022783ad023903734bd9057c4599dc465 not found: ID does not exist" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.395284 4834 scope.go:117] "RemoveContainer" containerID="4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353" Nov 26 12:26:58 crc kubenswrapper[4834]: E1126 12:26:58.395631 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353\": container with ID starting with 4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353 not found: ID does not exist" containerID="4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.395679 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353"} err="failed to get container status \"4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353\": rpc error: code = NotFound desc = could not find container \"4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353\": container with ID starting with 4649f08e9c74ac773d06825aed33390d4354d51411e4011a7d9f4f272843f353 not found: ID does not exist" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.426697 4834 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19845428-2c8a-43df-98f9-def4fc4e7a84" path="/var/lib/kubelet/pods/19845428-2c8a-43df-98f9-def4fc4e7a84/volumes" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.427435 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" path="/var/lib/kubelet/pods/23fd02e3-eaab-4e25-b9f7-74bee38551e7/volumes" Nov 26 12:26:58 crc kubenswrapper[4834]: I1126 12:26:58.428025 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" path="/var/lib/kubelet/pods/8d21dfc7-f906-45e6-99f3-bac942be8ed9/volumes" Nov 26 12:26:59 crc kubenswrapper[4834]: I1126 12:26:59.839338 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:26:59 crc kubenswrapper[4834]: I1126 12:26:59.891616 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c65849c7f-n2g6v"] Nov 26 12:26:59 crc kubenswrapper[4834]: I1126 12:26:59.891882 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" podUID="90315a1e-440b-414d-af26-a31c178faf53" containerName="dnsmasq-dns" containerID="cri-o://d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534" gracePeriod=10 Nov 26 12:26:59 crc kubenswrapper[4834]: I1126 12:26:59.971077 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.031432 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.336441 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.348298 4834 generic.go:334] "Generic (PLEG): container finished" podID="90315a1e-440b-414d-af26-a31c178faf53" containerID="d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534" exitCode=0 Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.348430 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.348436 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" event={"ID":"90315a1e-440b-414d-af26-a31c178faf53","Type":"ContainerDied","Data":"d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534"} Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.348523 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c65849c7f-n2g6v" event={"ID":"90315a1e-440b-414d-af26-a31c178faf53","Type":"ContainerDied","Data":"57977a60b69710b7a3f5f1baf1e488f82e7dc8ddd92adb307fe3b8e9a5030734"} Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.348551 4834 scope.go:117] "RemoveContainer" containerID="d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.348769 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerName="cinder-scheduler" containerID="cri-o://0af6fbc528da14750d78fc5ae6a22c310a7be054dbb6d32f6e4fc4b8fa853e8e" gracePeriod=30 Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.348794 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerName="probe" 
containerID="cri-o://90c5dbe11347bc7e18488697192007690a99c7ecc9f00c441a2f7ffb56c79440" gracePeriod=30 Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.354554 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64b956f958-2k5zd" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.135:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.135:8443: connect: connection refused" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.390503 4834 scope.go:117] "RemoveContainer" containerID="1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.421134 4834 scope.go:117] "RemoveContainer" containerID="d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534" Nov 26 12:27:00 crc kubenswrapper[4834]: E1126 12:27:00.421759 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534\": container with ID starting with d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534 not found: ID does not exist" containerID="d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.421819 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534"} err="failed to get container status \"d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534\": rpc error: code = NotFound desc = could not find container \"d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534\": container with ID starting with d26b759500c2eac9104324531d243b2687feed34931f4b1fb0c128d42349f534 not found: ID does not exist" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.421853 4834 scope.go:117] 
"RemoveContainer" containerID="1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8" Nov 26 12:27:00 crc kubenswrapper[4834]: E1126 12:27:00.422128 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8\": container with ID starting with 1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8 not found: ID does not exist" containerID="1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.422158 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8"} err="failed to get container status \"1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8\": rpc error: code = NotFound desc = could not find container \"1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8\": container with ID starting with 1b66e104ec98fe9c494516a0ae5957c699cba87a8a98d79f5b36df4391da59d8 not found: ID does not exist" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.440920 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srzrm\" (UniqueName: \"kubernetes.io/projected/90315a1e-440b-414d-af26-a31c178faf53-kube-api-access-srzrm\") pod \"90315a1e-440b-414d-af26-a31c178faf53\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.441055 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-sb\") pod \"90315a1e-440b-414d-af26-a31c178faf53\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.441365 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-config\") pod \"90315a1e-440b-414d-af26-a31c178faf53\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.441457 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-nb\") pod \"90315a1e-440b-414d-af26-a31c178faf53\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.441490 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-dns-svc\") pod \"90315a1e-440b-414d-af26-a31c178faf53\" (UID: \"90315a1e-440b-414d-af26-a31c178faf53\") " Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.446911 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90315a1e-440b-414d-af26-a31c178faf53-kube-api-access-srzrm" (OuterVolumeSpecName: "kube-api-access-srzrm") pod "90315a1e-440b-414d-af26-a31c178faf53" (UID: "90315a1e-440b-414d-af26-a31c178faf53"). InnerVolumeSpecName "kube-api-access-srzrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.481634 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90315a1e-440b-414d-af26-a31c178faf53" (UID: "90315a1e-440b-414d-af26-a31c178faf53"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.481644 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-config" (OuterVolumeSpecName: "config") pod "90315a1e-440b-414d-af26-a31c178faf53" (UID: "90315a1e-440b-414d-af26-a31c178faf53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.481880 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90315a1e-440b-414d-af26-a31c178faf53" (UID: "90315a1e-440b-414d-af26-a31c178faf53"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.484570 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90315a1e-440b-414d-af26-a31c178faf53" (UID: "90315a1e-440b-414d-af26-a31c178faf53"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.544361 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.544393 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.544405 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srzrm\" (UniqueName: \"kubernetes.io/projected/90315a1e-440b-414d-af26-a31c178faf53-kube-api-access-srzrm\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.544417 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.544426 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90315a1e-440b-414d-af26-a31c178faf53-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.692029 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c65849c7f-n2g6v"] Nov 26 12:27:00 crc kubenswrapper[4834]: I1126 12:27:00.697869 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c65849c7f-n2g6v"] Nov 26 12:27:01 crc kubenswrapper[4834]: I1126 12:27:01.366045 4834 generic.go:334] "Generic (PLEG): container finished" podID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerID="90c5dbe11347bc7e18488697192007690a99c7ecc9f00c441a2f7ffb56c79440" exitCode=0 Nov 26 12:27:01 crc 
kubenswrapper[4834]: I1126 12:27:01.366140 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"75be0f1c-a91a-428a-a8a0-7cab480d8c2a","Type":"ContainerDied","Data":"90c5dbe11347bc7e18488697192007690a99c7ecc9f00c441a2f7ffb56c79440"} Nov 26 12:27:02 crc kubenswrapper[4834]: I1126 12:27:02.425382 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90315a1e-440b-414d-af26-a31c178faf53" path="/var/lib/kubelet/pods/90315a1e-440b-414d-af26-a31c178faf53/volumes" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.384033 4834 generic.go:334] "Generic (PLEG): container finished" podID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerID="0af6fbc528da14750d78fc5ae6a22c310a7be054dbb6d32f6e4fc4b8fa853e8e" exitCode=0 Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.384334 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"75be0f1c-a91a-428a-a8a0-7cab480d8c2a","Type":"ContainerDied","Data":"0af6fbc528da14750d78fc5ae6a22c310a7be054dbb6d32f6e4fc4b8fa853e8e"} Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.384365 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"75be0f1c-a91a-428a-a8a0-7cab480d8c2a","Type":"ContainerDied","Data":"53c6a089ea0298398d5d054c62b91f6d870a3a212684885e61344938ac4e603b"} Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.384378 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53c6a089ea0298398d5d054c62b91f6d870a3a212684885e61344938ac4e603b" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.398232 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.499072 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cfrl\" (UniqueName: \"kubernetes.io/projected/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-kube-api-access-7cfrl\") pod \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.499121 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data-custom\") pod \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.499327 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-scripts\") pod \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.499364 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-combined-ca-bundle\") pod \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.499399 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-etc-machine-id\") pod \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.499533 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data\") pod \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\" (UID: \"75be0f1c-a91a-428a-a8a0-7cab480d8c2a\") " Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.500800 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "75be0f1c-a91a-428a-a8a0-7cab480d8c2a" (UID: "75be0f1c-a91a-428a-a8a0-7cab480d8c2a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.508445 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "75be0f1c-a91a-428a-a8a0-7cab480d8c2a" (UID: "75be0f1c-a91a-428a-a8a0-7cab480d8c2a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.508536 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-kube-api-access-7cfrl" (OuterVolumeSpecName: "kube-api-access-7cfrl") pod "75be0f1c-a91a-428a-a8a0-7cab480d8c2a" (UID: "75be0f1c-a91a-428a-a8a0-7cab480d8c2a"). InnerVolumeSpecName "kube-api-access-7cfrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.511465 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-scripts" (OuterVolumeSpecName: "scripts") pod "75be0f1c-a91a-428a-a8a0-7cab480d8c2a" (UID: "75be0f1c-a91a-428a-a8a0-7cab480d8c2a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.538561 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75be0f1c-a91a-428a-a8a0-7cab480d8c2a" (UID: "75be0f1c-a91a-428a-a8a0-7cab480d8c2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.566728 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data" (OuterVolumeSpecName: "config-data") pod "75be0f1c-a91a-428a-a8a0-7cab480d8c2a" (UID: "75be0f1c-a91a-428a-a8a0-7cab480d8c2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.602103 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.602132 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.602144 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cfrl\" (UniqueName: \"kubernetes.io/projected/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-kube-api-access-7cfrl\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.602154 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 
12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.602162 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:03 crc kubenswrapper[4834]: I1126 12:27:03.602171 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/75be0f1c-a91a-428a-a8a0-7cab480d8c2a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.391590 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.393636 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-595cbdb8c4-fwh2n" Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.451425 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.457990 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.470855 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471275 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api" Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471295 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api" Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471329 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerName="probe" Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 
12:27:04.471336 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerName="probe"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471352 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90315a1e-440b-414d-af26-a31c178faf53" containerName="init"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471357 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="90315a1e-440b-414d-af26-a31c178faf53" containerName="init"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471369 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerName="cinder-scheduler"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471374 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerName="cinder-scheduler"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471383 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerName="horizon-log"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471389 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerName="horizon-log"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471396 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerName="registry-server"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471401 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerName="registry-server"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471453 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerName="extract-utilities"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471460 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerName="extract-utilities"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471469 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerName="horizon"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471474 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerName="horizon"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471482 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerName="extract-content"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471488 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerName="extract-content"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471501 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90315a1e-440b-414d-af26-a31c178faf53" containerName="dnsmasq-dns"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471507 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="90315a1e-440b-414d-af26-a31c178faf53" containerName="dnsmasq-dns"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471518 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerName="horizon"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471523 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerName="horizon"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471531 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerName="horizon-log"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471536 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerName="horizon-log"
Nov 26 12:27:04 crc kubenswrapper[4834]: E1126 12:27:04.471545 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api-log"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471550 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api-log"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471697 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api-log"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471709 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerName="horizon-log"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471717 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="90315a1e-440b-414d-af26-a31c178faf53" containerName="dnsmasq-dns"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471725 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="19845428-2c8a-43df-98f9-def4fc4e7a84" containerName="registry-server"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471732 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerName="horizon"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471746 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9bdd80a-b1f9-42a7-8a60-4b514c778d4b" containerName="barbican-api"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471754 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerName="cinder-scheduler"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471765 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d21dfc7-f906-45e6-99f3-bac942be8ed9" containerName="horizon-log"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471775 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" containerName="probe"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.471784 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fd02e3-eaab-4e25-b9f7-74bee38551e7" containerName="horizon"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.472650 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.476619 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.496447 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.522694 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h59zk\" (UniqueName: \"kubernetes.io/projected/60fea7de-82b0-4107-b0cb-07f09bbd2341-kube-api-access-h59zk\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.522791 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60fea7de-82b0-4107-b0cb-07f09bbd2341-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.522865 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.522885 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.522901 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-config-data\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.523085 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-scripts\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.624861 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60fea7de-82b0-4107-b0cb-07f09bbd2341-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.625028 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.625066 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.625087 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-config-data\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.625150 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-scripts\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.625195 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h59zk\" (UniqueName: \"kubernetes.io/projected/60fea7de-82b0-4107-b0cb-07f09bbd2341-kube-api-access-h59zk\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.625022 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60fea7de-82b0-4107-b0cb-07f09bbd2341-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.631033 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-scripts\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.631575 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-config-data\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.637903 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.639160 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60fea7de-82b0-4107-b0cb-07f09bbd2341-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.646727 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h59zk\" (UniqueName: \"kubernetes.io/projected/60fea7de-82b0-4107-b0cb-07f09bbd2341-kube-api-access-h59zk\") pod \"cinder-scheduler-0\" (UID: \"60fea7de-82b0-4107-b0cb-07f09bbd2341\") " pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.759756 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.760945 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.763261 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.763516 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-d9gt4"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.763662 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.768100 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.793850 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.834731 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/61d6f7eb-6f11-4c69-b31f-75701ac020c2-openstack-config-secret\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.834788 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d6f7eb-6f11-4c69-b31f-75701ac020c2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.834828 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/61d6f7eb-6f11-4c69-b31f-75701ac020c2-openstack-config\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.834866 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk78x\" (UniqueName: \"kubernetes.io/projected/61d6f7eb-6f11-4c69-b31f-75701ac020c2-kube-api-access-jk78x\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.936454 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d6f7eb-6f11-4c69-b31f-75701ac020c2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.936745 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/61d6f7eb-6f11-4c69-b31f-75701ac020c2-openstack-config\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.936795 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk78x\" (UniqueName: \"kubernetes.io/projected/61d6f7eb-6f11-4c69-b31f-75701ac020c2-kube-api-access-jk78x\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.936876 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/61d6f7eb-6f11-4c69-b31f-75701ac020c2-openstack-config-secret\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.938002 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/61d6f7eb-6f11-4c69-b31f-75701ac020c2-openstack-config\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.940260 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/61d6f7eb-6f11-4c69-b31f-75701ac020c2-openstack-config-secret\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.941322 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d6f7eb-6f11-4c69-b31f-75701ac020c2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:04 crc kubenswrapper[4834]: I1126 12:27:04.953025 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk78x\" (UniqueName: \"kubernetes.io/projected/61d6f7eb-6f11-4c69-b31f-75701ac020c2-kube-api-access-jk78x\") pod \"openstackclient\" (UID: \"61d6f7eb-6f11-4c69-b31f-75701ac020c2\") " pod="openstack/openstackclient"
Nov 26 12:27:05 crc kubenswrapper[4834]: I1126 12:27:05.077749 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 26 12:27:05 crc kubenswrapper[4834]: I1126 12:27:05.213573 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 26 12:27:05 crc kubenswrapper[4834]: I1126 12:27:05.409041 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60fea7de-82b0-4107-b0cb-07f09bbd2341","Type":"ContainerStarted","Data":"7dd891587e7eb6b63c83313f12cd375d862e309230cea6271b40af7c369ab6f1"}
Nov 26 12:27:05 crc kubenswrapper[4834]: I1126 12:27:05.467400 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Nov 26 12:27:05 crc kubenswrapper[4834]: W1126 12:27:05.474768 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61d6f7eb_6f11_4c69_b31f_75701ac020c2.slice/crio-738c9e54cf33bb2701a47afd2e34681bf8e235433ef7669edc36080c1e1adf2d WatchSource:0}: Error finding container 738c9e54cf33bb2701a47afd2e34681bf8e235433ef7669edc36080c1e1adf2d: Status 404 returned error can't find the container with id 738c9e54cf33bb2701a47afd2e34681bf8e235433ef7669edc36080c1e1adf2d
Nov 26 12:27:06 crc kubenswrapper[4834]: I1126 12:27:06.299493 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Nov 26 12:27:06 crc kubenswrapper[4834]: I1126 12:27:06.434489 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75be0f1c-a91a-428a-a8a0-7cab480d8c2a" path="/var/lib/kubelet/pods/75be0f1c-a91a-428a-a8a0-7cab480d8c2a/volumes"
Nov 26 12:27:06 crc kubenswrapper[4834]: I1126 12:27:06.435808 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60fea7de-82b0-4107-b0cb-07f09bbd2341","Type":"ContainerStarted","Data":"40dba360e7ab80eb98b7a0da98df267ea286deaa6da947ee9c997765fb7ffd4d"}
Nov 26 12:27:06 crc kubenswrapper[4834]: I1126 12:27:06.435865 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60fea7de-82b0-4107-b0cb-07f09bbd2341","Type":"ContainerStarted","Data":"6baf6926d77c85dff8b4ec2d0dcfd1751398c70f0590a2e920a522d5edc954b0"}
Nov 26 12:27:06 crc kubenswrapper[4834]: I1126 12:27:06.456352 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.456336705 podStartE2EDuration="2.456336705s" podCreationTimestamp="2025-11-26 12:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:27:06.451908104 +0000 UTC m=+924.359121456" watchObservedRunningTime="2025-11-26 12:27:06.456336705 +0000 UTC m=+924.363550057"
Nov 26 12:27:06 crc kubenswrapper[4834]: I1126 12:27:06.456415 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"61d6f7eb-6f11-4c69-b31f-75701ac020c2","Type":"ContainerStarted","Data":"738c9e54cf33bb2701a47afd2e34681bf8e235433ef7669edc36080c1e1adf2d"}
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.214549 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qjndv"]
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.216297 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.232769 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qjndv"]
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.241732 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8685c85bd8-7kkqj"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.259135 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8685c85bd8-7kkqj"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.302409 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-utilities\") pod \"community-operators-qjndv\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.302462 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-catalog-content\") pod \"community-operators-qjndv\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.302615 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlrwr\" (UniqueName: \"kubernetes.io/projected/266b74b0-d7b4-4ce4-b411-6b0fbc048895-kube-api-access-nlrwr\") pod \"community-operators-qjndv\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.403842 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlrwr\" (UniqueName: \"kubernetes.io/projected/266b74b0-d7b4-4ce4-b411-6b0fbc048895-kube-api-access-nlrwr\") pod \"community-operators-qjndv\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.404145 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-utilities\") pod \"community-operators-qjndv\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.404181 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-catalog-content\") pod \"community-operators-qjndv\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.405976 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-catalog-content\") pod \"community-operators-qjndv\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.406016 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-utilities\") pod \"community-operators-qjndv\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.422104 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlrwr\" (UniqueName: \"kubernetes.io/projected/266b74b0-d7b4-4ce4-b411-6b0fbc048895-kube-api-access-nlrwr\") pod \"community-operators-qjndv\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:07 crc kubenswrapper[4834]: I1126 12:27:07.557644 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjndv"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.031610 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qjndv"]
Nov 26 12:27:08 crc kubenswrapper[4834]: W1126 12:27:08.058164 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod266b74b0_d7b4_4ce4_b411_6b0fbc048895.slice/crio-5b9b719975c2d93ff14fafe166d53b3cba8b02099a71a1d41f3026c7ee013817 WatchSource:0}: Error finding container 5b9b719975c2d93ff14fafe166d53b3cba8b02099a71a1d41f3026c7ee013817: Status 404 returned error can't find the container with id 5b9b719975c2d93ff14fafe166d53b3cba8b02099a71a1d41f3026c7ee013817
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.330754 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6jvd5"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.332013 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6jvd5"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.374988 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6jvd5"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.425889 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc5f3da7-3847-4e39-b7d6-908f8de8740c-operator-scripts\") pod \"nova-api-db-create-6jvd5\" (UID: \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\") " pod="openstack/nova-api-db-create-6jvd5"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.426006 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s5bl\" (UniqueName: \"kubernetes.io/projected/bc5f3da7-3847-4e39-b7d6-908f8de8740c-kube-api-access-2s5bl\") pod \"nova-api-db-create-6jvd5\" (UID: \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\") " pod="openstack/nova-api-db-create-6jvd5"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.449141 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-696dw"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.450350 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-696dw"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.454418 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-696dw"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.461703 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-96ae-account-create-update-8pxqn"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.462865 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-96ae-account-create-update-8pxqn"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.465992 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.477532 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-96ae-account-create-update-8pxqn"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.486949 4834 generic.go:334] "Generic (PLEG): container finished" podID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerID="93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7" exitCode=0
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.486988 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjndv" event={"ID":"266b74b0-d7b4-4ce4-b411-6b0fbc048895","Type":"ContainerDied","Data":"93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7"}
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.487011 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjndv" event={"ID":"266b74b0-d7b4-4ce4-b411-6b0fbc048895","Type":"ContainerStarted","Data":"5b9b719975c2d93ff14fafe166d53b3cba8b02099a71a1d41f3026c7ee013817"}
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.528248 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-operator-scripts\") pod \"nova-api-96ae-account-create-update-8pxqn\" (UID: \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\") " pod="openstack/nova-api-96ae-account-create-update-8pxqn"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.528483 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fwzs\" (UniqueName: \"kubernetes.io/projected/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-kube-api-access-7fwzs\") pod \"nova-api-96ae-account-create-update-8pxqn\" (UID: \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\") " pod="openstack/nova-api-96ae-account-create-update-8pxqn"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.528748 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc5f3da7-3847-4e39-b7d6-908f8de8740c-operator-scripts\") pod \"nova-api-db-create-6jvd5\" (UID: \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\") " pod="openstack/nova-api-db-create-6jvd5"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.528809 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c623e2b3-2d04-4579-bbf2-fe1aecc98118-operator-scripts\") pod \"nova-cell0-db-create-696dw\" (UID: \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\") " pod="openstack/nova-cell0-db-create-696dw"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.528973 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s5bl\" (UniqueName: \"kubernetes.io/projected/bc5f3da7-3847-4e39-b7d6-908f8de8740c-kube-api-access-2s5bl\") pod \"nova-api-db-create-6jvd5\" (UID: \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\") " pod="openstack/nova-api-db-create-6jvd5"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.529009 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mb9h\" (UniqueName: \"kubernetes.io/projected/c623e2b3-2d04-4579-bbf2-fe1aecc98118-kube-api-access-7mb9h\") pod \"nova-cell0-db-create-696dw\" (UID: \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\") " pod="openstack/nova-cell0-db-create-696dw"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.529877 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc5f3da7-3847-4e39-b7d6-908f8de8740c-operator-scripts\") pod \"nova-api-db-create-6jvd5\" (UID: \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\") " pod="openstack/nova-api-db-create-6jvd5"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.548614 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s5bl\" (UniqueName: \"kubernetes.io/projected/bc5f3da7-3847-4e39-b7d6-908f8de8740c-kube-api-access-2s5bl\") pod \"nova-api-db-create-6jvd5\" (UID: \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\") " pod="openstack/nova-api-db-create-6jvd5"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.630710 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-operator-scripts\") pod \"nova-api-96ae-account-create-update-8pxqn\" (UID: \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\") " pod="openstack/nova-api-96ae-account-create-update-8pxqn"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.630803 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fwzs\" (UniqueName: \"kubernetes.io/projected/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-kube-api-access-7fwzs\") pod \"nova-api-96ae-account-create-update-8pxqn\" (UID: \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\") " pod="openstack/nova-api-96ae-account-create-update-8pxqn"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.630848 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c623e2b3-2d04-4579-bbf2-fe1aecc98118-operator-scripts\") pod \"nova-cell0-db-create-696dw\" (UID: \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\") " pod="openstack/nova-cell0-db-create-696dw"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.630903 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mb9h\" (UniqueName: \"kubernetes.io/projected/c623e2b3-2d04-4579-bbf2-fe1aecc98118-kube-api-access-7mb9h\") pod \"nova-cell0-db-create-696dw\" (UID: \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\") " pod="openstack/nova-cell0-db-create-696dw"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.631635 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c623e2b3-2d04-4579-bbf2-fe1aecc98118-operator-scripts\") pod \"nova-cell0-db-create-696dw\" (UID: \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\") " pod="openstack/nova-cell0-db-create-696dw"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.631705 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-operator-scripts\") pod \"nova-api-96ae-account-create-update-8pxqn\" (UID: \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\") " pod="openstack/nova-api-96ae-account-create-update-8pxqn"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.640747 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kdm8m"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.654074 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6jvd5"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.655015 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f898-account-create-update-2b85n"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.656023 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f898-account-create-update-2b85n"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.660948 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.661179 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mb9h\" (UniqueName: \"kubernetes.io/projected/c623e2b3-2d04-4579-bbf2-fe1aecc98118-kube-api-access-7mb9h\") pod \"nova-cell0-db-create-696dw\" (UID: \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\") " pod="openstack/nova-cell0-db-create-696dw"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.661285 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kdm8m"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.662259 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fwzs\" (UniqueName: \"kubernetes.io/projected/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-kube-api-access-7fwzs\") pod \"nova-api-96ae-account-create-update-8pxqn\" (UID: \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\") " pod="openstack/nova-api-96ae-account-create-update-8pxqn"
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.665860 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kdm8m"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.680398 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f898-account-create-update-2b85n"]
Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.741244 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d582581d-dd1b-48dc-8356-f911726bf78e-operator-scripts\") pod \"nova-cell0-f898-account-create-update-2b85n\" (UID: \"d582581d-dd1b-48dc-8356-f911726bf78e\") "
pod="openstack/nova-cell0-f898-account-create-update-2b85n" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.741550 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8xrz\" (UniqueName: \"kubernetes.io/projected/23ea3d77-ce69-4f54-877e-56e1a9c9183d-kube-api-access-c8xrz\") pod \"nova-cell1-db-create-kdm8m\" (UID: \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\") " pod="openstack/nova-cell1-db-create-kdm8m" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.741607 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lpfm\" (UniqueName: \"kubernetes.io/projected/d582581d-dd1b-48dc-8356-f911726bf78e-kube-api-access-6lpfm\") pod \"nova-cell0-f898-account-create-update-2b85n\" (UID: \"d582581d-dd1b-48dc-8356-f911726bf78e\") " pod="openstack/nova-cell0-f898-account-create-update-2b85n" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.741642 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ea3d77-ce69-4f54-877e-56e1a9c9183d-operator-scripts\") pod \"nova-cell1-db-create-kdm8m\" (UID: \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\") " pod="openstack/nova-cell1-db-create-kdm8m" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.764996 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-696dw" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.780774 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-96ae-account-create-update-8pxqn" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.843201 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d582581d-dd1b-48dc-8356-f911726bf78e-operator-scripts\") pod \"nova-cell0-f898-account-create-update-2b85n\" (UID: \"d582581d-dd1b-48dc-8356-f911726bf78e\") " pod="openstack/nova-cell0-f898-account-create-update-2b85n" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.843268 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8xrz\" (UniqueName: \"kubernetes.io/projected/23ea3d77-ce69-4f54-877e-56e1a9c9183d-kube-api-access-c8xrz\") pod \"nova-cell1-db-create-kdm8m\" (UID: \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\") " pod="openstack/nova-cell1-db-create-kdm8m" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.843354 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lpfm\" (UniqueName: \"kubernetes.io/projected/d582581d-dd1b-48dc-8356-f911726bf78e-kube-api-access-6lpfm\") pod \"nova-cell0-f898-account-create-update-2b85n\" (UID: \"d582581d-dd1b-48dc-8356-f911726bf78e\") " pod="openstack/nova-cell0-f898-account-create-update-2b85n" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.843396 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ea3d77-ce69-4f54-877e-56e1a9c9183d-operator-scripts\") pod \"nova-cell1-db-create-kdm8m\" (UID: \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\") " pod="openstack/nova-cell1-db-create-kdm8m" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.844122 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ea3d77-ce69-4f54-877e-56e1a9c9183d-operator-scripts\") pod 
\"nova-cell1-db-create-kdm8m\" (UID: \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\") " pod="openstack/nova-cell1-db-create-kdm8m" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.845524 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d582581d-dd1b-48dc-8356-f911726bf78e-operator-scripts\") pod \"nova-cell0-f898-account-create-update-2b85n\" (UID: \"d582581d-dd1b-48dc-8356-f911726bf78e\") " pod="openstack/nova-cell0-f898-account-create-update-2b85n" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.864344 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-fecf-account-create-update-rbc66"] Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.864768 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lpfm\" (UniqueName: \"kubernetes.io/projected/d582581d-dd1b-48dc-8356-f911726bf78e-kube-api-access-6lpfm\") pod \"nova-cell0-f898-account-create-update-2b85n\" (UID: \"d582581d-dd1b-48dc-8356-f911726bf78e\") " pod="openstack/nova-cell0-f898-account-create-update-2b85n" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.868575 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.871065 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.881152 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8xrz\" (UniqueName: \"kubernetes.io/projected/23ea3d77-ce69-4f54-877e-56e1a9c9183d-kube-api-access-c8xrz\") pod \"nova-cell1-db-create-kdm8m\" (UID: \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\") " pod="openstack/nova-cell1-db-create-kdm8m" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.881200 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fecf-account-create-update-rbc66"] Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.946729 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvhjc\" (UniqueName: \"kubernetes.io/projected/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-kube-api-access-pvhjc\") pod \"nova-cell1-fecf-account-create-update-rbc66\" (UID: \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\") " pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:08 crc kubenswrapper[4834]: I1126 12:27:08.946865 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-operator-scripts\") pod \"nova-cell1-fecf-account-create-update-rbc66\" (UID: \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\") " pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.048897 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvhjc\" (UniqueName: \"kubernetes.io/projected/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-kube-api-access-pvhjc\") pod 
\"nova-cell1-fecf-account-create-update-rbc66\" (UID: \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\") " pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.049097 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-operator-scripts\") pod \"nova-cell1-fecf-account-create-update-rbc66\" (UID: \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\") " pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.050191 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-operator-scripts\") pod \"nova-cell1-fecf-account-create-update-rbc66\" (UID: \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\") " pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.063006 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvhjc\" (UniqueName: \"kubernetes.io/projected/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-kube-api-access-pvhjc\") pod \"nova-cell1-fecf-account-create-update-rbc66\" (UID: \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\") " pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.080511 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f898-account-create-update-2b85n" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.124064 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6jvd5"] Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.131072 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kdm8m" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.195470 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.280271 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-696dw"] Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.289833 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-96ae-account-create-update-8pxqn"] Nov 26 12:27:09 crc kubenswrapper[4834]: W1126 12:27:09.297809 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc623e2b3_2d04_4579_bbf2_fe1aecc98118.slice/crio-88d86a82196e67d16f81ac230739a9248ace49be124bb70504ded9052e396fd6 WatchSource:0}: Error finding container 88d86a82196e67d16f81ac230739a9248ace49be124bb70504ded9052e396fd6: Status 404 returned error can't find the container with id 88d86a82196e67d16f81ac230739a9248ace49be124bb70504ded9052e396fd6 Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.506626 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjndv" event={"ID":"266b74b0-d7b4-4ce4-b411-6b0fbc048895","Type":"ContainerStarted","Data":"26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460"} Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.508551 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-696dw" event={"ID":"c623e2b3-2d04-4579-bbf2-fe1aecc98118","Type":"ContainerStarted","Data":"88d86a82196e67d16f81ac230739a9248ace49be124bb70504ded9052e396fd6"} Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.511158 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6jvd5" 
event={"ID":"bc5f3da7-3847-4e39-b7d6-908f8de8740c","Type":"ContainerStarted","Data":"778784c4f5c14b5e5c3b18c0961907593fdfc7d78ac0e2cae4afda7faa6c1b4b"} Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.511188 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6jvd5" event={"ID":"bc5f3da7-3847-4e39-b7d6-908f8de8740c","Type":"ContainerStarted","Data":"4d5953951319dcd27a43c39c8df2d102ce58ab57072e3941fd2aa09de772d7ac"} Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.520675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-96ae-account-create-update-8pxqn" event={"ID":"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4","Type":"ContainerStarted","Data":"d85b75b77650a7f0b61cde22b43dda4aa8963631b76a43a92777ca6f7f227752"} Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.520702 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-96ae-account-create-update-8pxqn" event={"ID":"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4","Type":"ContainerStarted","Data":"9b0b364e07903fcec9371d1b843635f9adcd7bfd21726a827db9e7af08e4062a"} Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.546242 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f898-account-create-update-2b85n"] Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.551122 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-96ae-account-create-update-8pxqn" podStartSLOduration=1.551101478 podStartE2EDuration="1.551101478s" podCreationTimestamp="2025-11-26 12:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:27:09.536761923 +0000 UTC m=+927.443975275" watchObservedRunningTime="2025-11-26 12:27:09.551101478 +0000 UTC m=+927.458314829" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.557536 4834 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-db-create-6jvd5" podStartSLOduration=1.557523512 podStartE2EDuration="1.557523512s" podCreationTimestamp="2025-11-26 12:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:27:09.549030086 +0000 UTC m=+927.456243439" watchObservedRunningTime="2025-11-26 12:27:09.557523512 +0000 UTC m=+927.464736865" Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.619865 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kdm8m"] Nov 26 12:27:09 crc kubenswrapper[4834]: W1126 12:27:09.639049 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23ea3d77_ce69_4f54_877e_56e1a9c9183d.slice/crio-f8ffe32cac77a56bb4c23c78873f92dba65cbf429372b1d44e72da45265526bc WatchSource:0}: Error finding container f8ffe32cac77a56bb4c23c78873f92dba65cbf429372b1d44e72da45265526bc: Status 404 returned error can't find the container with id f8ffe32cac77a56bb4c23c78873f92dba65cbf429372b1d44e72da45265526bc Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.700058 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fecf-account-create-update-rbc66"] Nov 26 12:27:09 crc kubenswrapper[4834]: I1126 12:27:09.794555 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.351120 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64b956f958-2k5zd" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.135:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.135:8443: connect: connection refused" Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.530955 4834 generic.go:334] "Generic (PLEG): 
container finished" podID="d582581d-dd1b-48dc-8356-f911726bf78e" containerID="40e3be5605fd15f174e3be8cd4912cdadd09e826064b88ce16cd1b29ade9a15b" exitCode=0 Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.531100 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f898-account-create-update-2b85n" event={"ID":"d582581d-dd1b-48dc-8356-f911726bf78e","Type":"ContainerDied","Data":"40e3be5605fd15f174e3be8cd4912cdadd09e826064b88ce16cd1b29ade9a15b"} Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.531154 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f898-account-create-update-2b85n" event={"ID":"d582581d-dd1b-48dc-8356-f911726bf78e","Type":"ContainerStarted","Data":"4505d4a1db0f7d560e315f37866864e477dcc9a4344072bff608c012143b4e69"} Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.533132 4834 generic.go:334] "Generic (PLEG): container finished" podID="c623e2b3-2d04-4579-bbf2-fe1aecc98118" containerID="9ed195b735780020224948fd1aa2e26a0c4a55439af451293798f9648ffa3046" exitCode=0 Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.533190 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-696dw" event={"ID":"c623e2b3-2d04-4579-bbf2-fe1aecc98118","Type":"ContainerDied","Data":"9ed195b735780020224948fd1aa2e26a0c4a55439af451293798f9648ffa3046"} Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.535503 4834 generic.go:334] "Generic (PLEG): container finished" podID="bc5f3da7-3847-4e39-b7d6-908f8de8740c" containerID="778784c4f5c14b5e5c3b18c0961907593fdfc7d78ac0e2cae4afda7faa6c1b4b" exitCode=0 Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.535585 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6jvd5" event={"ID":"bc5f3da7-3847-4e39-b7d6-908f8de8740c","Type":"ContainerDied","Data":"778784c4f5c14b5e5c3b18c0961907593fdfc7d78ac0e2cae4afda7faa6c1b4b"} Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 
12:27:10.537334 4834 generic.go:334] "Generic (PLEG): container finished" podID="80ab7f94-6d9b-4ac8-a50a-9332f240ddf4" containerID="d85b75b77650a7f0b61cde22b43dda4aa8963631b76a43a92777ca6f7f227752" exitCode=0 Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.537389 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-96ae-account-create-update-8pxqn" event={"ID":"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4","Type":"ContainerDied","Data":"d85b75b77650a7f0b61cde22b43dda4aa8963631b76a43a92777ca6f7f227752"} Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.544560 4834 generic.go:334] "Generic (PLEG): container finished" podID="23ea3d77-ce69-4f54-877e-56e1a9c9183d" containerID="586c00aebc9cd36d3de7a558343e02479269bdedd2bef06bdcee8c8f62d001e8" exitCode=0 Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.544649 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kdm8m" event={"ID":"23ea3d77-ce69-4f54-877e-56e1a9c9183d","Type":"ContainerDied","Data":"586c00aebc9cd36d3de7a558343e02479269bdedd2bef06bdcee8c8f62d001e8"} Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.544679 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kdm8m" event={"ID":"23ea3d77-ce69-4f54-877e-56e1a9c9183d","Type":"ContainerStarted","Data":"f8ffe32cac77a56bb4c23c78873f92dba65cbf429372b1d44e72da45265526bc"} Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.566591 4834 generic.go:334] "Generic (PLEG): container finished" podID="93841b51-b2ec-473c-ae5d-f7f652ba6aa7" containerID="748d665e660676e17ae4ee029f0a9abd2545dc41c8b5fae14feade09b0a33fed" exitCode=0 Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.566702 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fecf-account-create-update-rbc66" event={"ID":"93841b51-b2ec-473c-ae5d-f7f652ba6aa7","Type":"ContainerDied","Data":"748d665e660676e17ae4ee029f0a9abd2545dc41c8b5fae14feade09b0a33fed"} Nov 
26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.566739 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fecf-account-create-update-rbc66" event={"ID":"93841b51-b2ec-473c-ae5d-f7f652ba6aa7","Type":"ContainerStarted","Data":"b3058a8849281a8e78b3f56773f54ed76cbddaa3300373aff27f9be61a67d309"} Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.573724 4834 generic.go:334] "Generic (PLEG): container finished" podID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerID="26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460" exitCode=0 Nov 26 12:27:10 crc kubenswrapper[4834]: I1126 12:27:10.574423 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjndv" event={"ID":"266b74b0-d7b4-4ce4-b411-6b0fbc048895","Type":"ContainerDied","Data":"26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460"} Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.018400 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.470379 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6jvd5" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.501351 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kdm8m" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.514902 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s5bl\" (UniqueName: \"kubernetes.io/projected/bc5f3da7-3847-4e39-b7d6-908f8de8740c-kube-api-access-2s5bl\") pod \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\" (UID: \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.515188 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc5f3da7-3847-4e39-b7d6-908f8de8740c-operator-scripts\") pod \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\" (UID: \"bc5f3da7-3847-4e39-b7d6-908f8de8740c\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.517513 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc5f3da7-3847-4e39-b7d6-908f8de8740c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc5f3da7-3847-4e39-b7d6-908f8de8740c" (UID: "bc5f3da7-3847-4e39-b7d6-908f8de8740c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.521844 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5f3da7-3847-4e39-b7d6-908f8de8740c-kube-api-access-2s5bl" (OuterVolumeSpecName: "kube-api-access-2s5bl") pod "bc5f3da7-3847-4e39-b7d6-908f8de8740c" (UID: "bc5f3da7-3847-4e39-b7d6-908f8de8740c"). InnerVolumeSpecName "kube-api-access-2s5bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.531697 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f898-account-create-update-2b85n" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.548802 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-96ae-account-create-update-8pxqn" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.561947 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-696dw" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.577157 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.618465 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8xrz\" (UniqueName: \"kubernetes.io/projected/23ea3d77-ce69-4f54-877e-56e1a9c9183d-kube-api-access-c8xrz\") pod \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\" (UID: \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.618567 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fwzs\" (UniqueName: \"kubernetes.io/projected/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-kube-api-access-7fwzs\") pod \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\" (UID: \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.618590 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvhjc\" (UniqueName: \"kubernetes.io/projected/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-kube-api-access-pvhjc\") pod \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\" (UID: \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.618633 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mb9h\" (UniqueName: 
\"kubernetes.io/projected/c623e2b3-2d04-4579-bbf2-fe1aecc98118-kube-api-access-7mb9h\") pod \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\" (UID: \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.618664 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-operator-scripts\") pod \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\" (UID: \"93841b51-b2ec-473c-ae5d-f7f652ba6aa7\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.618732 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c623e2b3-2d04-4579-bbf2-fe1aecc98118-operator-scripts\") pod \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\" (UID: \"c623e2b3-2d04-4579-bbf2-fe1aecc98118\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.618815 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d582581d-dd1b-48dc-8356-f911726bf78e-operator-scripts\") pod \"d582581d-dd1b-48dc-8356-f911726bf78e\" (UID: \"d582581d-dd1b-48dc-8356-f911726bf78e\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.618846 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ea3d77-ce69-4f54-877e-56e1a9c9183d-operator-scripts\") pod \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\" (UID: \"23ea3d77-ce69-4f54-877e-56e1a9c9183d\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.618872 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-operator-scripts\") pod \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\" (UID: \"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4\") " Nov 26 12:27:15 crc 
kubenswrapper[4834]: I1126 12:27:15.618913 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lpfm\" (UniqueName: \"kubernetes.io/projected/d582581d-dd1b-48dc-8356-f911726bf78e-kube-api-access-6lpfm\") pod \"d582581d-dd1b-48dc-8356-f911726bf78e\" (UID: \"d582581d-dd1b-48dc-8356-f911726bf78e\") " Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.619201 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc5f3da7-3847-4e39-b7d6-908f8de8740c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.619213 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s5bl\" (UniqueName: \"kubernetes.io/projected/bc5f3da7-3847-4e39-b7d6-908f8de8740c-kube-api-access-2s5bl\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.619283 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c623e2b3-2d04-4579-bbf2-fe1aecc98118-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c623e2b3-2d04-4579-bbf2-fe1aecc98118" (UID: "c623e2b3-2d04-4579-bbf2-fe1aecc98118"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.619666 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93841b51-b2ec-473c-ae5d-f7f652ba6aa7" (UID: "93841b51-b2ec-473c-ae5d-f7f652ba6aa7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.619734 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ea3d77-ce69-4f54-877e-56e1a9c9183d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23ea3d77-ce69-4f54-877e-56e1a9c9183d" (UID: "23ea3d77-ce69-4f54-877e-56e1a9c9183d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.619937 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80ab7f94-6d9b-4ac8-a50a-9332f240ddf4" (UID: "80ab7f94-6d9b-4ac8-a50a-9332f240ddf4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.621803 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d582581d-dd1b-48dc-8356-f911726bf78e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d582581d-dd1b-48dc-8356-f911726bf78e" (UID: "d582581d-dd1b-48dc-8356-f911726bf78e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.622431 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-kube-api-access-pvhjc" (OuterVolumeSpecName: "kube-api-access-pvhjc") pod "93841b51-b2ec-473c-ae5d-f7f652ba6aa7" (UID: "93841b51-b2ec-473c-ae5d-f7f652ba6aa7"). InnerVolumeSpecName "kube-api-access-pvhjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.622643 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ea3d77-ce69-4f54-877e-56e1a9c9183d-kube-api-access-c8xrz" (OuterVolumeSpecName: "kube-api-access-c8xrz") pod "23ea3d77-ce69-4f54-877e-56e1a9c9183d" (UID: "23ea3d77-ce69-4f54-877e-56e1a9c9183d"). InnerVolumeSpecName "kube-api-access-c8xrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.623020 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d582581d-dd1b-48dc-8356-f911726bf78e-kube-api-access-6lpfm" (OuterVolumeSpecName: "kube-api-access-6lpfm") pod "d582581d-dd1b-48dc-8356-f911726bf78e" (UID: "d582581d-dd1b-48dc-8356-f911726bf78e"). InnerVolumeSpecName "kube-api-access-6lpfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.623389 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-kube-api-access-7fwzs" (OuterVolumeSpecName: "kube-api-access-7fwzs") pod "80ab7f94-6d9b-4ac8-a50a-9332f240ddf4" (UID: "80ab7f94-6d9b-4ac8-a50a-9332f240ddf4"). InnerVolumeSpecName "kube-api-access-7fwzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.623912 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c623e2b3-2d04-4579-bbf2-fe1aecc98118-kube-api-access-7mb9h" (OuterVolumeSpecName: "kube-api-access-7mb9h") pod "c623e2b3-2d04-4579-bbf2-fe1aecc98118" (UID: "c623e2b3-2d04-4579-bbf2-fe1aecc98118"). InnerVolumeSpecName "kube-api-access-7mb9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.648644 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kdm8m" event={"ID":"23ea3d77-ce69-4f54-877e-56e1a9c9183d","Type":"ContainerDied","Data":"f8ffe32cac77a56bb4c23c78873f92dba65cbf429372b1d44e72da45265526bc"} Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.648688 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8ffe32cac77a56bb4c23c78873f92dba65cbf429372b1d44e72da45265526bc" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.648755 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kdm8m" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.650914 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fecf-account-create-update-rbc66" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.651419 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fecf-account-create-update-rbc66" event={"ID":"93841b51-b2ec-473c-ae5d-f7f652ba6aa7","Type":"ContainerDied","Data":"b3058a8849281a8e78b3f56773f54ed76cbddaa3300373aff27f9be61a67d309"} Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.651464 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3058a8849281a8e78b3f56773f54ed76cbddaa3300373aff27f9be61a67d309" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.653415 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjndv" event={"ID":"266b74b0-d7b4-4ce4-b411-6b0fbc048895","Type":"ContainerStarted","Data":"2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75"} Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.655788 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-f898-account-create-update-2b85n" event={"ID":"d582581d-dd1b-48dc-8356-f911726bf78e","Type":"ContainerDied","Data":"4505d4a1db0f7d560e315f37866864e477dcc9a4344072bff608c012143b4e69"} Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.655837 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4505d4a1db0f7d560e315f37866864e477dcc9a4344072bff608c012143b4e69" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.655913 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f898-account-create-update-2b85n" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.667108 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-696dw" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.667122 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-696dw" event={"ID":"c623e2b3-2d04-4579-bbf2-fe1aecc98118","Type":"ContainerDied","Data":"88d86a82196e67d16f81ac230739a9248ace49be124bb70504ded9052e396fd6"} Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.667306 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88d86a82196e67d16f81ac230739a9248ace49be124bb70504ded9052e396fd6" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.672664 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6jvd5" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.672774 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6jvd5" event={"ID":"bc5f3da7-3847-4e39-b7d6-908f8de8740c","Type":"ContainerDied","Data":"4d5953951319dcd27a43c39c8df2d102ce58ab57072e3941fd2aa09de772d7ac"} Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.673705 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d5953951319dcd27a43c39c8df2d102ce58ab57072e3941fd2aa09de772d7ac" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.675342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"61d6f7eb-6f11-4c69-b31f-75701ac020c2","Type":"ContainerStarted","Data":"e29616e2877b0b5bf53f76a5edddfab49d37f2a119243346f7ee5fba7689ae2b"} Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.677576 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-96ae-account-create-update-8pxqn" event={"ID":"80ab7f94-6d9b-4ac8-a50a-9332f240ddf4","Type":"ContainerDied","Data":"9b0b364e07903fcec9371d1b843635f9adcd7bfd21726a827db9e7af08e4062a"} Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.677622 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b0b364e07903fcec9371d1b843635f9adcd7bfd21726a827db9e7af08e4062a" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.677705 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-96ae-account-create-update-8pxqn" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.692540 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qjndv" podStartSLOduration=1.929542026 podStartE2EDuration="8.692527531s" podCreationTimestamp="2025-11-26 12:27:07 +0000 UTC" firstStartedPulling="2025-11-26 12:27:08.496385741 +0000 UTC m=+926.403599093" lastFinishedPulling="2025-11-26 12:27:15.259371256 +0000 UTC m=+933.166584598" observedRunningTime="2025-11-26 12:27:15.680857964 +0000 UTC m=+933.588071316" watchObservedRunningTime="2025-11-26 12:27:15.692527531 +0000 UTC m=+933.599740884" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.708071 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.899213176 podStartE2EDuration="11.708054874s" podCreationTimestamp="2025-11-26 12:27:04 +0000 UTC" firstStartedPulling="2025-11-26 12:27:05.4764431 +0000 UTC m=+923.383656452" lastFinishedPulling="2025-11-26 12:27:15.285284798 +0000 UTC m=+933.192498150" observedRunningTime="2025-11-26 12:27:15.694203477 +0000 UTC m=+933.601416829" watchObservedRunningTime="2025-11-26 12:27:15.708054874 +0000 UTC m=+933.615268225" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721160 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721186 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lpfm\" (UniqueName: \"kubernetes.io/projected/d582581d-dd1b-48dc-8356-f911726bf78e-kube-api-access-6lpfm\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721196 4834 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-c8xrz\" (UniqueName: \"kubernetes.io/projected/23ea3d77-ce69-4f54-877e-56e1a9c9183d-kube-api-access-c8xrz\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721205 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fwzs\" (UniqueName: \"kubernetes.io/projected/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4-kube-api-access-7fwzs\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721215 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvhjc\" (UniqueName: \"kubernetes.io/projected/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-kube-api-access-pvhjc\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721236 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mb9h\" (UniqueName: \"kubernetes.io/projected/c623e2b3-2d04-4579-bbf2-fe1aecc98118-kube-api-access-7mb9h\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721244 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93841b51-b2ec-473c-ae5d-f7f652ba6aa7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721255 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c623e2b3-2d04-4579-bbf2-fe1aecc98118-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721265 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d582581d-dd1b-48dc-8356-f911726bf78e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:15 crc kubenswrapper[4834]: I1126 12:27:15.721274 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/23ea3d77-ce69-4f54-877e-56e1a9c9183d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:17 crc kubenswrapper[4834]: I1126 12:27:17.527322 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 26 12:27:17 crc kubenswrapper[4834]: I1126 12:27:17.557941 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qjndv" Nov 26 12:27:17 crc kubenswrapper[4834]: I1126 12:27:17.557984 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qjndv" Nov 26 12:27:17 crc kubenswrapper[4834]: I1126 12:27:17.599788 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qjndv" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.364651 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.364903 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="ceilometer-central-agent" containerID="cri-o://9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5" gracePeriod=30 Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.365005 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="proxy-httpd" containerID="cri-o://4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479" gracePeriod=30 Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.365047 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="sg-core" 
containerID="cri-o://ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2" gracePeriod=30 Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.365033 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="ceilometer-notification-agent" containerID="cri-o://b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a" gracePeriod=30 Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.719855 4834 generic.go:334] "Generic (PLEG): container finished" podID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerID="4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479" exitCode=0 Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.720277 4834 generic.go:334] "Generic (PLEG): container finished" podID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerID="ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2" exitCode=2 Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.720286 4834 generic.go:334] "Generic (PLEG): container finished" podID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerID="9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5" exitCode=0 Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.719945 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerDied","Data":"4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479"} Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.720994 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerDied","Data":"ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2"} Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.721058 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerDied","Data":"9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5"} Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.898105 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x7vq5"] Nov 26 12:27:18 crc kubenswrapper[4834]: E1126 12:27:18.898705 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ea3d77-ce69-4f54-877e-56e1a9c9183d" containerName="mariadb-database-create" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.898767 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ea3d77-ce69-4f54-877e-56e1a9c9183d" containerName="mariadb-database-create" Nov 26 12:27:18 crc kubenswrapper[4834]: E1126 12:27:18.898817 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d582581d-dd1b-48dc-8356-f911726bf78e" containerName="mariadb-account-create-update" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.898875 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d582581d-dd1b-48dc-8356-f911726bf78e" containerName="mariadb-account-create-update" Nov 26 12:27:18 crc kubenswrapper[4834]: E1126 12:27:18.898926 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c623e2b3-2d04-4579-bbf2-fe1aecc98118" containerName="mariadb-database-create" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.898967 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c623e2b3-2d04-4579-bbf2-fe1aecc98118" containerName="mariadb-database-create" Nov 26 12:27:18 crc kubenswrapper[4834]: E1126 12:27:18.899013 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ab7f94-6d9b-4ac8-a50a-9332f240ddf4" containerName="mariadb-account-create-update" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.899052 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ab7f94-6d9b-4ac8-a50a-9332f240ddf4" containerName="mariadb-account-create-update" Nov 26 
12:27:18 crc kubenswrapper[4834]: E1126 12:27:18.899110 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5f3da7-3847-4e39-b7d6-908f8de8740c" containerName="mariadb-database-create" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.899160 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5f3da7-3847-4e39-b7d6-908f8de8740c" containerName="mariadb-database-create" Nov 26 12:27:18 crc kubenswrapper[4834]: E1126 12:27:18.899208 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93841b51-b2ec-473c-ae5d-f7f652ba6aa7" containerName="mariadb-account-create-update" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.899262 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="93841b51-b2ec-473c-ae5d-f7f652ba6aa7" containerName="mariadb-account-create-update" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.899464 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d582581d-dd1b-48dc-8356-f911726bf78e" containerName="mariadb-account-create-update" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.899525 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c623e2b3-2d04-4579-bbf2-fe1aecc98118" containerName="mariadb-database-create" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.899572 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5f3da7-3847-4e39-b7d6-908f8de8740c" containerName="mariadb-database-create" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.899626 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="93841b51-b2ec-473c-ae5d-f7f652ba6aa7" containerName="mariadb-account-create-update" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.899673 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ea3d77-ce69-4f54-877e-56e1a9c9183d" containerName="mariadb-database-create" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.899721 4834 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="80ab7f94-6d9b-4ac8-a50a-9332f240ddf4" containerName="mariadb-account-create-update" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.900234 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.901849 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.902861 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-r8sxf" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.904031 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.913871 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x7vq5"] Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.988909 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-scripts\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.988992 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9sdb\" (UniqueName: \"kubernetes.io/projected/a2eb7a31-4805-49e5-81cc-58208e57f440-kube-api-access-g9sdb\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.989047 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-config-data\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:18 crc kubenswrapper[4834]: I1126 12:27:18.989064 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.091563 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-scripts\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.091617 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9sdb\" (UniqueName: \"kubernetes.io/projected/a2eb7a31-4805-49e5-81cc-58208e57f440-kube-api-access-g9sdb\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.091680 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-config-data\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.091704 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.100888 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.101700 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-config-data\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.102130 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-scripts\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.113852 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9sdb\" (UniqueName: \"kubernetes.io/projected/a2eb7a31-4805-49e5-81cc-58208e57f440-kube-api-access-g9sdb\") pod \"nova-cell0-conductor-db-sync-x7vq5\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.215774 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.620655 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x7vq5"] Nov 26 12:27:19 crc kubenswrapper[4834]: I1126 12:27:19.738325 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x7vq5" event={"ID":"a2eb7a31-4805-49e5-81cc-58208e57f440","Type":"ContainerStarted","Data":"9f69a68cb551a4f9bb9a8fc4dfee84d1ffa927111338e0a2de772e716d7b1e88"} Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.287911 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.351944 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64b956f958-2k5zd" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.135:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.135:8443: connect: connection refused" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.352370 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.419633 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wzsf\" (UniqueName: \"kubernetes.io/projected/f32f5f3a-b8f8-459a-891b-91aad92910c2-kube-api-access-6wzsf\") pod \"f32f5f3a-b8f8-459a-891b-91aad92910c2\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.419722 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-config-data\") pod \"f32f5f3a-b8f8-459a-891b-91aad92910c2\" (UID: 
\"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.419824 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-combined-ca-bundle\") pod \"f32f5f3a-b8f8-459a-891b-91aad92910c2\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.419876 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-scripts\") pod \"f32f5f3a-b8f8-459a-891b-91aad92910c2\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.420015 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-log-httpd\") pod \"f32f5f3a-b8f8-459a-891b-91aad92910c2\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.420051 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-sg-core-conf-yaml\") pod \"f32f5f3a-b8f8-459a-891b-91aad92910c2\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.420389 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-run-httpd\") pod \"f32f5f3a-b8f8-459a-891b-91aad92910c2\" (UID: \"f32f5f3a-b8f8-459a-891b-91aad92910c2\") " Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.420799 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-log-httpd" 
(OuterVolumeSpecName: "log-httpd") pod "f32f5f3a-b8f8-459a-891b-91aad92910c2" (UID: "f32f5f3a-b8f8-459a-891b-91aad92910c2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.421053 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.421033 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f32f5f3a-b8f8-459a-891b-91aad92910c2" (UID: "f32f5f3a-b8f8-459a-891b-91aad92910c2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.435451 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-scripts" (OuterVolumeSpecName: "scripts") pod "f32f5f3a-b8f8-459a-891b-91aad92910c2" (UID: "f32f5f3a-b8f8-459a-891b-91aad92910c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.435569 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32f5f3a-b8f8-459a-891b-91aad92910c2-kube-api-access-6wzsf" (OuterVolumeSpecName: "kube-api-access-6wzsf") pod "f32f5f3a-b8f8-459a-891b-91aad92910c2" (UID: "f32f5f3a-b8f8-459a-891b-91aad92910c2"). InnerVolumeSpecName "kube-api-access-6wzsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.451576 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f32f5f3a-b8f8-459a-891b-91aad92910c2" (UID: "f32f5f3a-b8f8-459a-891b-91aad92910c2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.490331 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f32f5f3a-b8f8-459a-891b-91aad92910c2" (UID: "f32f5f3a-b8f8-459a-891b-91aad92910c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.500838 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-config-data" (OuterVolumeSpecName: "config-data") pod "f32f5f3a-b8f8-459a-891b-91aad92910c2" (UID: "f32f5f3a-b8f8-459a-891b-91aad92910c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.522663 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.522692 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f32f5f3a-b8f8-459a-891b-91aad92910c2-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.522703 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wzsf\" (UniqueName: \"kubernetes.io/projected/f32f5f3a-b8f8-459a-891b-91aad92910c2-kube-api-access-6wzsf\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.522715 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.522724 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.522732 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f5f3a-b8f8-459a-891b-91aad92910c2-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.769465 4834 generic.go:334] "Generic (PLEG): container finished" podID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerID="b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a" exitCode=0 Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.769517 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerDied","Data":"b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a"} Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.769552 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f32f5f3a-b8f8-459a-891b-91aad92910c2","Type":"ContainerDied","Data":"0d049cf1fa9bb8e4c2fa346ae75b72579c28aa324d1f9568a7df46818435e31f"} Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.769575 4834 scope.go:117] "RemoveContainer" containerID="4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.769643 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.813270 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.829367 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.841647 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:20 crc kubenswrapper[4834]: E1126 12:27:20.842047 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="ceilometer-notification-agent" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.842065 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="ceilometer-notification-agent" Nov 26 12:27:20 crc kubenswrapper[4834]: E1126 12:27:20.842086 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="ceilometer-central-agent" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.842092 4834 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="ceilometer-central-agent" Nov 26 12:27:20 crc kubenswrapper[4834]: E1126 12:27:20.842112 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="sg-core" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.842120 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="sg-core" Nov 26 12:27:20 crc kubenswrapper[4834]: E1126 12:27:20.842132 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="proxy-httpd" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.842138 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="proxy-httpd" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.842353 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="ceilometer-notification-agent" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.842374 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="sg-core" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.842387 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="ceilometer-central-agent" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.842399 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" containerName="proxy-httpd" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.844339 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.847894 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.848141 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.854644 4834 scope.go:117] "RemoveContainer" containerID="ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.869911 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.931734 4834 scope.go:117] "RemoveContainer" containerID="b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.936118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvn2k\" (UniqueName: \"kubernetes.io/projected/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-kube-api-access-jvn2k\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.936185 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-log-httpd\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.936233 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-config-data\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " 
pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.936256 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-scripts\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.936335 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.936368 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-run-httpd\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.936400 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.959669 4834 scope.go:117] "RemoveContainer" containerID="9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.988391 4834 scope.go:117] "RemoveContainer" containerID="4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479" Nov 26 12:27:20 crc kubenswrapper[4834]: E1126 12:27:20.988694 4834 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479\": container with ID starting with 4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479 not found: ID does not exist" containerID="4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.988726 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479"} err="failed to get container status \"4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479\": rpc error: code = NotFound desc = could not find container \"4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479\": container with ID starting with 4ed60d2c3eb84ea8ea080cc35bc98862f3f32abb28c57dfe7e34e3befacc9479 not found: ID does not exist" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.988761 4834 scope.go:117] "RemoveContainer" containerID="ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2" Nov 26 12:27:20 crc kubenswrapper[4834]: E1126 12:27:20.989155 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2\": container with ID starting with ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2 not found: ID does not exist" containerID="ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.989179 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2"} err="failed to get container status \"ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2\": rpc error: code = NotFound desc = could not find container 
\"ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2\": container with ID starting with ff9e1390835a4b30ce7bb57b592ea41cf90ea735c2a820e100569ad8f1aeafc2 not found: ID does not exist" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.989191 4834 scope.go:117] "RemoveContainer" containerID="b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a" Nov 26 12:27:20 crc kubenswrapper[4834]: E1126 12:27:20.989456 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a\": container with ID starting with b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a not found: ID does not exist" containerID="b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.989472 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a"} err="failed to get container status \"b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a\": rpc error: code = NotFound desc = could not find container \"b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a\": container with ID starting with b58229ab393dd4b10b5a950a35fe0ce443eba181a993ef805f5577520561a47a not found: ID does not exist" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.989485 4834 scope.go:117] "RemoveContainer" containerID="9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5" Nov 26 12:27:20 crc kubenswrapper[4834]: E1126 12:27:20.989965 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5\": container with ID starting with 9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5 not found: ID does not exist" 
containerID="9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5" Nov 26 12:27:20 crc kubenswrapper[4834]: I1126 12:27:20.989987 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5"} err="failed to get container status \"9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5\": rpc error: code = NotFound desc = could not find container \"9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5\": container with ID starting with 9cade85ac42207ddc9933338af54cdc4019b203591aeb8f1ee1f77e8d5bad1b5 not found: ID does not exist" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.037499 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvn2k\" (UniqueName: \"kubernetes.io/projected/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-kube-api-access-jvn2k\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.037553 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-log-httpd\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.037581 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-config-data\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.037606 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-scripts\") pod \"ceilometer-0\" 
(UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.037658 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.037688 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-run-httpd\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.037715 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.038606 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-log-httpd\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.039137 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-run-httpd\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.043233 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.043344 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.043823 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-scripts\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.044257 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-config-data\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.057451 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvn2k\" (UniqueName: \"kubernetes.io/projected/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-kube-api-access-jvn2k\") pod \"ceilometer-0\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.185000 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.530874 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.531180 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.531247 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.532144 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a30f1ec8aaa63fca565600131ca721203e6bad41e9594f352f74db2095a7e3eb"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.532215 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://a30f1ec8aaa63fca565600131ca721203e6bad41e9594f352f74db2095a7e3eb" gracePeriod=600 Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.618408 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:21 
crc kubenswrapper[4834]: W1126 12:27:21.618842 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cb86dca_7a60_42ff_84ba_716e77c9b3c7.slice/crio-e82574d088ecb3a085c92a5a753366eb7c3d2cda5f9accdce96cf4ab287b1e2e WatchSource:0}: Error finding container e82574d088ecb3a085c92a5a753366eb7c3d2cda5f9accdce96cf4ab287b1e2e: Status 404 returned error can't find the container with id e82574d088ecb3a085c92a5a753366eb7c3d2cda5f9accdce96cf4ab287b1e2e Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.784945 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="a30f1ec8aaa63fca565600131ca721203e6bad41e9594f352f74db2095a7e3eb" exitCode=0 Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.785027 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"a30f1ec8aaa63fca565600131ca721203e6bad41e9594f352f74db2095a7e3eb"} Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.785300 4834 scope.go:117] "RemoveContainer" containerID="a287c3eb83da9750c870469d031aef3c2e315d099f1dc03a585c96cc0c81709e" Nov 26 12:27:21 crc kubenswrapper[4834]: I1126 12:27:21.787203 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerStarted","Data":"e82574d088ecb3a085c92a5a753366eb7c3d2cda5f9accdce96cf4ab287b1e2e"} Nov 26 12:27:22 crc kubenswrapper[4834]: I1126 12:27:22.346844 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:22 crc kubenswrapper[4834]: I1126 12:27:22.434683 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f32f5f3a-b8f8-459a-891b-91aad92910c2" path="/var/lib/kubelet/pods/f32f5f3a-b8f8-459a-891b-91aad92910c2/volumes" Nov 26 
12:27:22 crc kubenswrapper[4834]: I1126 12:27:22.827887 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"5856a524b96de1a2b853af8b527df2165a8d3f4d203109c2d5ffa66afdcc60ed"} Nov 26 12:27:22 crc kubenswrapper[4834]: I1126 12:27:22.831738 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerStarted","Data":"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4"} Nov 26 12:27:24 crc kubenswrapper[4834]: I1126 12:27:24.849980 4834 generic.go:334] "Generic (PLEG): container finished" podID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerID="b1d2249bb255822f9a4b38553c1dc1bfbdc8ce15098e178e88a006df9ed337af" exitCode=137 Nov 26 12:27:24 crc kubenswrapper[4834]: I1126 12:27:24.850065 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64b956f958-2k5zd" event={"ID":"4de7fa58-f514-49a8-88b0-205ef138c8a3","Type":"ContainerDied","Data":"b1d2249bb255822f9a4b38553c1dc1bfbdc8ce15098e178e88a006df9ed337af"} Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.155870 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.286796 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-combined-ca-bundle\") pod \"4de7fa58-f514-49a8-88b0-205ef138c8a3\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.286874 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4de7fa58-f514-49a8-88b0-205ef138c8a3-logs\") pod \"4de7fa58-f514-49a8-88b0-205ef138c8a3\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.287166 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/4de7fa58-f514-49a8-88b0-205ef138c8a3-kube-api-access-gd9nn\") pod \"4de7fa58-f514-49a8-88b0-205ef138c8a3\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.287295 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-config-data\") pod \"4de7fa58-f514-49a8-88b0-205ef138c8a3\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.287426 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-tls-certs\") pod \"4de7fa58-f514-49a8-88b0-205ef138c8a3\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.287455 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-secret-key\") pod \"4de7fa58-f514-49a8-88b0-205ef138c8a3\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.287512 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-scripts\") pod \"4de7fa58-f514-49a8-88b0-205ef138c8a3\" (UID: \"4de7fa58-f514-49a8-88b0-205ef138c8a3\") " Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.292265 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de7fa58-f514-49a8-88b0-205ef138c8a3-logs" (OuterVolumeSpecName: "logs") pod "4de7fa58-f514-49a8-88b0-205ef138c8a3" (UID: "4de7fa58-f514-49a8-88b0-205ef138c8a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.295849 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de7fa58-f514-49a8-88b0-205ef138c8a3-kube-api-access-gd9nn" (OuterVolumeSpecName: "kube-api-access-gd9nn") pod "4de7fa58-f514-49a8-88b0-205ef138c8a3" (UID: "4de7fa58-f514-49a8-88b0-205ef138c8a3"). InnerVolumeSpecName "kube-api-access-gd9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.300637 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4de7fa58-f514-49a8-88b0-205ef138c8a3" (UID: "4de7fa58-f514-49a8-88b0-205ef138c8a3"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.312807 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-scripts" (OuterVolumeSpecName: "scripts") pod "4de7fa58-f514-49a8-88b0-205ef138c8a3" (UID: "4de7fa58-f514-49a8-88b0-205ef138c8a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.321687 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-config-data" (OuterVolumeSpecName: "config-data") pod "4de7fa58-f514-49a8-88b0-205ef138c8a3" (UID: "4de7fa58-f514-49a8-88b0-205ef138c8a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.327458 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4de7fa58-f514-49a8-88b0-205ef138c8a3" (UID: "4de7fa58-f514-49a8-88b0-205ef138c8a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.335463 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "4de7fa58-f514-49a8-88b0-205ef138c8a3" (UID: "4de7fa58-f514-49a8-88b0-205ef138c8a3"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.389613 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/4de7fa58-f514-49a8-88b0-205ef138c8a3-kube-api-access-gd9nn\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.389645 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.389660 4834 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.389672 4834 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.389682 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4de7fa58-f514-49a8-88b0-205ef138c8a3-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.389695 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de7fa58-f514-49a8-88b0-205ef138c8a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.389704 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4de7fa58-f514-49a8-88b0-205ef138c8a3-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.597977 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qjndv" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.636634 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qjndv"] Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.909263 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x7vq5" event={"ID":"a2eb7a31-4805-49e5-81cc-58208e57f440","Type":"ContainerStarted","Data":"b6f58900e8721452ffca28c24f35a59f242fd9d8179dbe33d6c438d6f0c41d60"} Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.911989 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerStarted","Data":"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8"} Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.914114 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qjndv" podUID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerName="registry-server" containerID="cri-o://2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75" gracePeriod=2 Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.914418 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64b956f958-2k5zd" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.916266 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64b956f958-2k5zd" event={"ID":"4de7fa58-f514-49a8-88b0-205ef138c8a3","Type":"ContainerDied","Data":"5b2750ed77556e5c8d1be1639927631aa7b031dfe843ffcb9e1e9d42cc4551f1"} Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.916329 4834 scope.go:117] "RemoveContainer" containerID="9e4fcaac3592dd2cfcbe81a7173beea0340ff06e304762077d856046e28d7416" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.922917 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-x7vq5" podStartSLOduration=2.621477551 podStartE2EDuration="9.922901998s" podCreationTimestamp="2025-11-26 12:27:18 +0000 UTC" firstStartedPulling="2025-11-26 12:27:19.624795618 +0000 UTC m=+937.532008970" lastFinishedPulling="2025-11-26 12:27:26.926220065 +0000 UTC m=+944.833433417" observedRunningTime="2025-11-26 12:27:27.920570757 +0000 UTC m=+945.827784109" watchObservedRunningTime="2025-11-26 12:27:27.922901998 +0000 UTC m=+945.830115350" Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.953026 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64b956f958-2k5zd"] Nov 26 12:27:27 crc kubenswrapper[4834]: I1126 12:27:27.961678 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64b956f958-2k5zd"] Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.090443 4834 scope.go:117] "RemoveContainer" containerID="b1d2249bb255822f9a4b38553c1dc1bfbdc8ce15098e178e88a006df9ed337af" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.396852 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qjndv" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.426332 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" path="/var/lib/kubelet/pods/4de7fa58-f514-49a8-88b0-205ef138c8a3/volumes" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.511149 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-catalog-content\") pod \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.511275 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-utilities\") pod \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.511493 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlrwr\" (UniqueName: \"kubernetes.io/projected/266b74b0-d7b4-4ce4-b411-6b0fbc048895-kube-api-access-nlrwr\") pod \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\" (UID: \"266b74b0-d7b4-4ce4-b411-6b0fbc048895\") " Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.512035 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-utilities" (OuterVolumeSpecName: "utilities") pod "266b74b0-d7b4-4ce4-b411-6b0fbc048895" (UID: "266b74b0-d7b4-4ce4-b411-6b0fbc048895"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.516632 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/266b74b0-d7b4-4ce4-b411-6b0fbc048895-kube-api-access-nlrwr" (OuterVolumeSpecName: "kube-api-access-nlrwr") pod "266b74b0-d7b4-4ce4-b411-6b0fbc048895" (UID: "266b74b0-d7b4-4ce4-b411-6b0fbc048895"). InnerVolumeSpecName "kube-api-access-nlrwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.557529 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "266b74b0-d7b4-4ce4-b411-6b0fbc048895" (UID: "266b74b0-d7b4-4ce4-b411-6b0fbc048895"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.614237 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlrwr\" (UniqueName: \"kubernetes.io/projected/266b74b0-d7b4-4ce4-b411-6b0fbc048895-kube-api-access-nlrwr\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.614273 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.614287 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/266b74b0-d7b4-4ce4-b411-6b0fbc048895-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.923687 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerStarted","Data":"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314"} Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.926371 4834 generic.go:334] "Generic (PLEG): container finished" podID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerID="2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75" exitCode=0 Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.926429 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjndv" event={"ID":"266b74b0-d7b4-4ce4-b411-6b0fbc048895","Type":"ContainerDied","Data":"2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75"} Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.926451 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjndv" event={"ID":"266b74b0-d7b4-4ce4-b411-6b0fbc048895","Type":"ContainerDied","Data":"5b9b719975c2d93ff14fafe166d53b3cba8b02099a71a1d41f3026c7ee013817"} Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.926460 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qjndv" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.926472 4834 scope.go:117] "RemoveContainer" containerID="2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.960726 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qjndv"] Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.963248 4834 scope.go:117] "RemoveContainer" containerID="26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460" Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.965684 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qjndv"] Nov 26 12:27:28 crc kubenswrapper[4834]: I1126 12:27:28.981597 4834 scope.go:117] "RemoveContainer" containerID="93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.019415 4834 scope.go:117] "RemoveContainer" containerID="2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75" Nov 26 12:27:29 crc kubenswrapper[4834]: E1126 12:27:29.020007 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75\": container with ID starting with 2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75 not found: ID does not exist" containerID="2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.020039 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75"} err="failed to get container status \"2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75\": rpc error: code = NotFound desc = could not find 
container \"2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75\": container with ID starting with 2777ee486314d7520cf592de23516f0a504fad17c9b830ed7e3938f0b9ff2d75 not found: ID does not exist" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.020059 4834 scope.go:117] "RemoveContainer" containerID="26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460" Nov 26 12:27:29 crc kubenswrapper[4834]: E1126 12:27:29.020837 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460\": container with ID starting with 26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460 not found: ID does not exist" containerID="26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.020867 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460"} err="failed to get container status \"26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460\": rpc error: code = NotFound desc = could not find container \"26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460\": container with ID starting with 26cb53313bcd5781310ab37993b4186bdb9986ca0dec96307c14f933b1e38460 not found: ID does not exist" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.020889 4834 scope.go:117] "RemoveContainer" containerID="93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7" Nov 26 12:27:29 crc kubenswrapper[4834]: E1126 12:27:29.021242 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7\": container with ID starting with 93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7 not found: ID does 
not exist" containerID="93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.021271 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7"} err="failed to get container status \"93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7\": rpc error: code = NotFound desc = could not find container \"93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7\": container with ID starting with 93313ba8aa780d325f312b16444c5ceeb96f612df036756d5c590350b970b0c7 not found: ID does not exist" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.843793 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k9tz9"] Nov 26 12:27:29 crc kubenswrapper[4834]: E1126 12:27:29.844487 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon-log" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.844501 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon-log" Nov 26 12:27:29 crc kubenswrapper[4834]: E1126 12:27:29.844510 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerName="registry-server" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.844515 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerName="registry-server" Nov 26 12:27:29 crc kubenswrapper[4834]: E1126 12:27:29.844529 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.844535 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" 
containerName="horizon" Nov 26 12:27:29 crc kubenswrapper[4834]: E1126 12:27:29.844548 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerName="extract-utilities" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.844554 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerName="extract-utilities" Nov 26 12:27:29 crc kubenswrapper[4834]: E1126 12:27:29.844582 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerName="extract-content" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.844588 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerName="extract-content" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.844749 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.844764 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de7fa58-f514-49a8-88b0-205ef138c8a3" containerName="horizon-log" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.844775 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" containerName="registry-server" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.845998 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.872256 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k9tz9"] Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.939787 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph7rn\" (UniqueName: \"kubernetes.io/projected/52608f20-2e28-494a-82b8-4a3ffb789799-kube-api-access-ph7rn\") pod \"redhat-operators-k9tz9\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.939844 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-catalog-content\") pod \"redhat-operators-k9tz9\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.940052 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-utilities\") pod \"redhat-operators-k9tz9\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.944208 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerStarted","Data":"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db"} Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.944261 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.944242 
4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="ceilometer-central-agent" containerID="cri-o://1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4" gracePeriod=30 Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.944303 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="ceilometer-notification-agent" containerID="cri-o://aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8" gracePeriod=30 Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.944343 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="sg-core" containerID="cri-o://fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314" gracePeriod=30 Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.944382 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="proxy-httpd" containerID="cri-o://168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db" gracePeriod=30 Nov 26 12:27:29 crc kubenswrapper[4834]: I1126 12:27:29.967821 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.036489035 podStartE2EDuration="9.96780752s" podCreationTimestamp="2025-11-26 12:27:20 +0000 UTC" firstStartedPulling="2025-11-26 12:27:21.622358351 +0000 UTC m=+939.529571703" lastFinishedPulling="2025-11-26 12:27:29.553676836 +0000 UTC m=+947.460890188" observedRunningTime="2025-11-26 12:27:29.963500469 +0000 UTC m=+947.870713821" watchObservedRunningTime="2025-11-26 12:27:29.96780752 +0000 UTC m=+947.875020871" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 
12:27:30.042729 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-utilities\") pod \"redhat-operators-k9tz9\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.042852 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph7rn\" (UniqueName: \"kubernetes.io/projected/52608f20-2e28-494a-82b8-4a3ffb789799-kube-api-access-ph7rn\") pod \"redhat-operators-k9tz9\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.042875 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-catalog-content\") pod \"redhat-operators-k9tz9\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.043562 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-utilities\") pod \"redhat-operators-k9tz9\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.043589 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-catalog-content\") pod \"redhat-operators-k9tz9\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.062267 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ph7rn\" (UniqueName: \"kubernetes.io/projected/52608f20-2e28-494a-82b8-4a3ffb789799-kube-api-access-ph7rn\") pod \"redhat-operators-k9tz9\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.166059 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.427408 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="266b74b0-d7b4-4ce4-b411-6b0fbc048895" path="/var/lib/kubelet/pods/266b74b0-d7b4-4ce4-b411-6b0fbc048895/volumes" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.512729 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.578396 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k9tz9"] Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.653265 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-sg-core-conf-yaml\") pod \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.653361 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-scripts\") pod \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.653402 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-run-httpd\") pod \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.653432 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-config-data\") pod \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.653563 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-combined-ca-bundle\") pod \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.653611 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvn2k\" (UniqueName: \"kubernetes.io/projected/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-kube-api-access-jvn2k\") pod \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.653785 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-log-httpd\") pod \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\" (UID: \"4cb86dca-7a60-42ff-84ba-716e77c9b3c7\") " Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.653922 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4cb86dca-7a60-42ff-84ba-716e77c9b3c7" (UID: "4cb86dca-7a60-42ff-84ba-716e77c9b3c7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.654241 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.654366 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4cb86dca-7a60-42ff-84ba-716e77c9b3c7" (UID: "4cb86dca-7a60-42ff-84ba-716e77c9b3c7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.663212 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-scripts" (OuterVolumeSpecName: "scripts") pod "4cb86dca-7a60-42ff-84ba-716e77c9b3c7" (UID: "4cb86dca-7a60-42ff-84ba-716e77c9b3c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.663251 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-kube-api-access-jvn2k" (OuterVolumeSpecName: "kube-api-access-jvn2k") pod "4cb86dca-7a60-42ff-84ba-716e77c9b3c7" (UID: "4cb86dca-7a60-42ff-84ba-716e77c9b3c7"). InnerVolumeSpecName "kube-api-access-jvn2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.681525 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4cb86dca-7a60-42ff-84ba-716e77c9b3c7" (UID: "4cb86dca-7a60-42ff-84ba-716e77c9b3c7"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.714023 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cb86dca-7a60-42ff-84ba-716e77c9b3c7" (UID: "4cb86dca-7a60-42ff-84ba-716e77c9b3c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.743488 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-config-data" (OuterVolumeSpecName: "config-data") pod "4cb86dca-7a60-42ff-84ba-716e77c9b3c7" (UID: "4cb86dca-7a60-42ff-84ba-716e77c9b3c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.756340 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.756379 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvn2k\" (UniqueName: \"kubernetes.io/projected/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-kube-api-access-jvn2k\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.756391 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.756400 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.756409 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.756417 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb86dca-7a60-42ff-84ba-716e77c9b3c7-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.962775 4834 generic.go:334] "Generic (PLEG): container finished" podID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerID="168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db" exitCode=0 Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.962814 4834 generic.go:334] "Generic (PLEG): container finished" podID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerID="fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314" exitCode=2 Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.962826 4834 generic.go:334] "Generic (PLEG): container finished" podID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerID="aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8" exitCode=0 Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.962836 4834 generic.go:334] "Generic (PLEG): container finished" podID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerID="1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4" exitCode=0 Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.962857 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.962862 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerDied","Data":"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db"} Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.962956 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerDied","Data":"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314"} Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.962971 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerDied","Data":"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8"} Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.962999 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerDied","Data":"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4"} Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.963009 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cb86dca-7a60-42ff-84ba-716e77c9b3c7","Type":"ContainerDied","Data":"e82574d088ecb3a085c92a5a753366eb7c3d2cda5f9accdce96cf4ab287b1e2e"} Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.963029 4834 scope.go:117] "RemoveContainer" containerID="168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db" Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.965600 4834 generic.go:334] "Generic (PLEG): container finished" podID="52608f20-2e28-494a-82b8-4a3ffb789799" containerID="95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746" exitCode=0 Nov 26 12:27:30 crc 
kubenswrapper[4834]: I1126 12:27:30.965639 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9tz9" event={"ID":"52608f20-2e28-494a-82b8-4a3ffb789799","Type":"ContainerDied","Data":"95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746"} Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.965702 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9tz9" event={"ID":"52608f20-2e28-494a-82b8-4a3ffb789799","Type":"ContainerStarted","Data":"ad97bbe9ad9808e2891b22e7c572b0dc1f0ad4f13daa34c79b5f8763e08629ca"} Nov 26 12:27:30 crc kubenswrapper[4834]: I1126 12:27:30.983545 4834 scope.go:117] "RemoveContainer" containerID="fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.004537 4834 scope.go:117] "RemoveContainer" containerID="aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.007577 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.016031 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.024232 4834 scope.go:117] "RemoveContainer" containerID="1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.027823 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:31 crc kubenswrapper[4834]: E1126 12:27:31.028395 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="proxy-httpd" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.028415 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="proxy-httpd" Nov 26 
12:27:31 crc kubenswrapper[4834]: E1126 12:27:31.028437 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="ceilometer-notification-agent" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.028444 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="ceilometer-notification-agent" Nov 26 12:27:31 crc kubenswrapper[4834]: E1126 12:27:31.028470 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="ceilometer-central-agent" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.028475 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="ceilometer-central-agent" Nov 26 12:27:31 crc kubenswrapper[4834]: E1126 12:27:31.028489 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="sg-core" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.028496 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="sg-core" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.028647 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="sg-core" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.028668 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="ceilometer-central-agent" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.028684 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="ceilometer-notification-agent" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.028694 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" containerName="proxy-httpd" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.034060 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.037022 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.037921 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.055563 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.058407 4834 scope.go:117] "RemoveContainer" containerID="168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db" Nov 26 12:27:31 crc kubenswrapper[4834]: E1126 12:27:31.058945 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db\": container with ID starting with 168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db not found: ID does not exist" containerID="168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.058978 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db"} err="failed to get container status \"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db\": rpc error: code = NotFound desc = could not find container \"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db\": container with ID starting with 168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: 
I1126 12:27:31.059004 4834 scope.go:117] "RemoveContainer" containerID="fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314" Nov 26 12:27:31 crc kubenswrapper[4834]: E1126 12:27:31.059368 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314\": container with ID starting with fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314 not found: ID does not exist" containerID="fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.059393 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314"} err="failed to get container status \"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314\": rpc error: code = NotFound desc = could not find container \"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314\": container with ID starting with fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.059408 4834 scope.go:117] "RemoveContainer" containerID="aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8" Nov 26 12:27:31 crc kubenswrapper[4834]: E1126 12:27:31.059591 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8\": container with ID starting with aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8 not found: ID does not exist" containerID="aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.059613 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8"} err="failed to get container status \"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8\": rpc error: code = NotFound desc = could not find container \"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8\": container with ID starting with aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.059631 4834 scope.go:117] "RemoveContainer" containerID="1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4" Nov 26 12:27:31 crc kubenswrapper[4834]: E1126 12:27:31.059860 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4\": container with ID starting with 1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4 not found: ID does not exist" containerID="1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.059883 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4"} err="failed to get container status \"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4\": rpc error: code = NotFound desc = could not find container \"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4\": container with ID starting with 1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.059898 4834 scope.go:117] "RemoveContainer" containerID="168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.060041 4834 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db"} err="failed to get container status \"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db\": rpc error: code = NotFound desc = could not find container \"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db\": container with ID starting with 168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.060060 4834 scope.go:117] "RemoveContainer" containerID="fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.060216 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314"} err="failed to get container status \"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314\": rpc error: code = NotFound desc = could not find container \"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314\": container with ID starting with fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.060235 4834 scope.go:117] "RemoveContainer" containerID="aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.061769 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8"} err="failed to get container status \"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8\": rpc error: code = NotFound desc = could not find container \"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8\": container with ID starting with aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8 not 
found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.061799 4834 scope.go:117] "RemoveContainer" containerID="1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.062024 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4"} err="failed to get container status \"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4\": rpc error: code = NotFound desc = could not find container \"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4\": container with ID starting with 1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.062047 4834 scope.go:117] "RemoveContainer" containerID="168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.062253 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db"} err="failed to get container status \"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db\": rpc error: code = NotFound desc = could not find container \"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db\": container with ID starting with 168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.062278 4834 scope.go:117] "RemoveContainer" containerID="fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.062623 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314"} err="failed to get 
container status \"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314\": rpc error: code = NotFound desc = could not find container \"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314\": container with ID starting with fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.062651 4834 scope.go:117] "RemoveContainer" containerID="aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.062865 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8"} err="failed to get container status \"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8\": rpc error: code = NotFound desc = could not find container \"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8\": container with ID starting with aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.062890 4834 scope.go:117] "RemoveContainer" containerID="1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.063071 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4"} err="failed to get container status \"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4\": rpc error: code = NotFound desc = could not find container \"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4\": container with ID starting with 1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.063127 4834 scope.go:117] "RemoveContainer" 
containerID="168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.063407 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db"} err="failed to get container status \"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db\": rpc error: code = NotFound desc = could not find container \"168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db\": container with ID starting with 168328c88311246d7379c4ef456ede15adba412a0d800ddfc3a6514c63c0e3db not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.063433 4834 scope.go:117] "RemoveContainer" containerID="fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.063642 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314"} err="failed to get container status \"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314\": rpc error: code = NotFound desc = could not find container \"fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314\": container with ID starting with fc26141f26310808b3354c2be7f2aa697e26f1eed0528251b98c853afc3a6314 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.063679 4834 scope.go:117] "RemoveContainer" containerID="aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.063879 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8"} err="failed to get container status \"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8\": rpc error: code = NotFound desc = could 
not find container \"aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8\": container with ID starting with aba760bdd903d2c30059ce509a9e308de1a7723561dc47e8a1df66f5be87a6d8 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.063898 4834 scope.go:117] "RemoveContainer" containerID="1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.064057 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4"} err="failed to get container status \"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4\": rpc error: code = NotFound desc = could not find container \"1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4\": container with ID starting with 1f54643232bbb4b687d9a799e05b6f301bc6d42cfa4082047dca40198b32dfe4 not found: ID does not exist" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.162260 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-scripts\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.162458 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjzbl\" (UniqueName: \"kubernetes.io/projected/1727213d-f498-4d31-8389-b8d2e93421fe-kube-api-access-rjzbl\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.162534 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.162665 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-run-httpd\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.162764 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.162814 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-config-data\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.162889 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-log-httpd\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.263654 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-log-httpd\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " 
pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.263838 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-scripts\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.263919 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjzbl\" (UniqueName: \"kubernetes.io/projected/1727213d-f498-4d31-8389-b8d2e93421fe-kube-api-access-rjzbl\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.263984 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.264048 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-run-httpd\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.264119 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.264221 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-config-data\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.265458 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-run-httpd\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.266501 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-log-httpd\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.269909 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.270415 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-config-data\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.271930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-scripts\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.277212 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.298973 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjzbl\" (UniqueName: \"kubernetes.io/projected/1727213d-f498-4d31-8389-b8d2e93421fe-kube-api-access-rjzbl\") pod \"ceilometer-0\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.347988 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.758125 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:31 crc kubenswrapper[4834]: I1126 12:27:31.976649 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerStarted","Data":"2aa3784015a39dca76413402f56b06fd4ba625855b4403fa35631e8cc89dd2d8"} Nov 26 12:27:32 crc kubenswrapper[4834]: I1126 12:27:32.426946 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb86dca-7a60-42ff-84ba-716e77c9b3c7" path="/var/lib/kubelet/pods/4cb86dca-7a60-42ff-84ba-716e77c9b3c7/volumes" Nov 26 12:27:32 crc kubenswrapper[4834]: I1126 12:27:32.987855 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerStarted","Data":"b77a65a6ef53ebc665fe2786518587a8e4568e5766b23a6d9c4fca39177a95a7"} Nov 26 12:27:32 crc kubenswrapper[4834]: I1126 12:27:32.990071 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9tz9" 
event={"ID":"52608f20-2e28-494a-82b8-4a3ffb789799","Type":"ContainerStarted","Data":"85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f"} Nov 26 12:27:34 crc kubenswrapper[4834]: I1126 12:27:34.005300 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerStarted","Data":"efb98005413717a8b4bb571082fe9c98f02090e0d20bd4b8d144b1e6bf30cc6c"} Nov 26 12:27:34 crc kubenswrapper[4834]: I1126 12:27:34.006857 4834 generic.go:334] "Generic (PLEG): container finished" podID="a2eb7a31-4805-49e5-81cc-58208e57f440" containerID="b6f58900e8721452ffca28c24f35a59f242fd9d8179dbe33d6c438d6f0c41d60" exitCode=0 Nov 26 12:27:34 crc kubenswrapper[4834]: I1126 12:27:34.006949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x7vq5" event={"ID":"a2eb7a31-4805-49e5-81cc-58208e57f440","Type":"ContainerDied","Data":"b6f58900e8721452ffca28c24f35a59f242fd9d8179dbe33d6c438d6f0c41d60"} Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.019416 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerStarted","Data":"64ba6dd2878be898c8cd7e92e66f21de05e428d96ae0232e00cd9aa8c843255f"} Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.021641 4834 generic.go:334] "Generic (PLEG): container finished" podID="52608f20-2e28-494a-82b8-4a3ffb789799" containerID="85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f" exitCode=0 Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.021705 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9tz9" event={"ID":"52608f20-2e28-494a-82b8-4a3ffb789799","Type":"ContainerDied","Data":"85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f"} Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.328093 4834 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.361827 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9sdb\" (UniqueName: \"kubernetes.io/projected/a2eb7a31-4805-49e5-81cc-58208e57f440-kube-api-access-g9sdb\") pod \"a2eb7a31-4805-49e5-81cc-58208e57f440\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.361923 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-combined-ca-bundle\") pod \"a2eb7a31-4805-49e5-81cc-58208e57f440\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.362007 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-config-data\") pod \"a2eb7a31-4805-49e5-81cc-58208e57f440\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.362180 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-scripts\") pod \"a2eb7a31-4805-49e5-81cc-58208e57f440\" (UID: \"a2eb7a31-4805-49e5-81cc-58208e57f440\") " Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.368465 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-scripts" (OuterVolumeSpecName: "scripts") pod "a2eb7a31-4805-49e5-81cc-58208e57f440" (UID: "a2eb7a31-4805-49e5-81cc-58208e57f440"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.381455 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2eb7a31-4805-49e5-81cc-58208e57f440-kube-api-access-g9sdb" (OuterVolumeSpecName: "kube-api-access-g9sdb") pod "a2eb7a31-4805-49e5-81cc-58208e57f440" (UID: "a2eb7a31-4805-49e5-81cc-58208e57f440"). InnerVolumeSpecName "kube-api-access-g9sdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.390385 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-config-data" (OuterVolumeSpecName: "config-data") pod "a2eb7a31-4805-49e5-81cc-58208e57f440" (UID: "a2eb7a31-4805-49e5-81cc-58208e57f440"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.433386 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2eb7a31-4805-49e5-81cc-58208e57f440" (UID: "a2eb7a31-4805-49e5-81cc-58208e57f440"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.464872 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9sdb\" (UniqueName: \"kubernetes.io/projected/a2eb7a31-4805-49e5-81cc-58208e57f440-kube-api-access-g9sdb\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.464906 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.464916 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.464925 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2eb7a31-4805-49e5-81cc-58208e57f440-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:35 crc kubenswrapper[4834]: I1126 12:27:35.670065 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.044697 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="ceilometer-central-agent" containerID="cri-o://b77a65a6ef53ebc665fe2786518587a8e4568e5766b23a6d9c4fca39177a95a7" gracePeriod=30 Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.045330 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerStarted","Data":"aa7615db5ea456c3fa2ec98d904c6f4ad9a46f818342a9757a67de8749e68823"} Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.045399 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.045724 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="proxy-httpd" containerID="cri-o://aa7615db5ea456c3fa2ec98d904c6f4ad9a46f818342a9757a67de8749e68823" gracePeriod=30 Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.045801 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="sg-core" containerID="cri-o://64ba6dd2878be898c8cd7e92e66f21de05e428d96ae0232e00cd9aa8c843255f" gracePeriod=30 Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.045851 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="ceilometer-notification-agent" containerID="cri-o://efb98005413717a8b4bb571082fe9c98f02090e0d20bd4b8d144b1e6bf30cc6c" gracePeriod=30 Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.051161 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9tz9" event={"ID":"52608f20-2e28-494a-82b8-4a3ffb789799","Type":"ContainerStarted","Data":"91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114"} Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.053229 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x7vq5" event={"ID":"a2eb7a31-4805-49e5-81cc-58208e57f440","Type":"ContainerDied","Data":"9f69a68cb551a4f9bb9a8fc4dfee84d1ffa927111338e0a2de772e716d7b1e88"} Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.053265 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f69a68cb551a4f9bb9a8fc4dfee84d1ffa927111338e0a2de772e716d7b1e88" Nov 26 
12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.053350 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x7vq5" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.100986 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.275337329 podStartE2EDuration="5.100965419s" podCreationTimestamp="2025-11-26 12:27:31 +0000 UTC" firstStartedPulling="2025-11-26 12:27:31.769116294 +0000 UTC m=+949.676329647" lastFinishedPulling="2025-11-26 12:27:35.594744385 +0000 UTC m=+953.501957737" observedRunningTime="2025-11-26 12:27:36.069244361 +0000 UTC m=+953.976457712" watchObservedRunningTime="2025-11-26 12:27:36.100965419 +0000 UTC m=+954.008178771" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.118983 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k9tz9" podStartSLOduration=2.520278983 podStartE2EDuration="7.118962484s" podCreationTimestamp="2025-11-26 12:27:29 +0000 UTC" firstStartedPulling="2025-11-26 12:27:30.967617215 +0000 UTC m=+948.874830567" lastFinishedPulling="2025-11-26 12:27:35.566300717 +0000 UTC m=+953.473514068" observedRunningTime="2025-11-26 12:27:36.082955321 +0000 UTC m=+953.990168673" watchObservedRunningTime="2025-11-26 12:27:36.118962484 +0000 UTC m=+954.026175836" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.129815 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 12:27:36 crc kubenswrapper[4834]: E1126 12:27:36.130387 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2eb7a31-4805-49e5-81cc-58208e57f440" containerName="nova-cell0-conductor-db-sync" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.130409 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2eb7a31-4805-49e5-81cc-58208e57f440" 
containerName="nova-cell0-conductor-db-sync" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.130605 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2eb7a31-4805-49e5-81cc-58208e57f440" containerName="nova-cell0-conductor-db-sync" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.131434 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.132987 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-r8sxf" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.134744 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.143457 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.281022 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e75301-90f4-4498-b64f-027c0fc4b257-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f7e75301-90f4-4498-b64f-027c0fc4b257\") " pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.281065 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnsnn\" (UniqueName: \"kubernetes.io/projected/f7e75301-90f4-4498-b64f-027c0fc4b257-kube-api-access-dnsnn\") pod \"nova-cell0-conductor-0\" (UID: \"f7e75301-90f4-4498-b64f-027c0fc4b257\") " pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.281198 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7e75301-90f4-4498-b64f-027c0fc4b257-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f7e75301-90f4-4498-b64f-027c0fc4b257\") " pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.383034 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e75301-90f4-4498-b64f-027c0fc4b257-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f7e75301-90f4-4498-b64f-027c0fc4b257\") " pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.383117 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnsnn\" (UniqueName: \"kubernetes.io/projected/f7e75301-90f4-4498-b64f-027c0fc4b257-kube-api-access-dnsnn\") pod \"nova-cell0-conductor-0\" (UID: \"f7e75301-90f4-4498-b64f-027c0fc4b257\") " pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.383283 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e75301-90f4-4498-b64f-027c0fc4b257-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f7e75301-90f4-4498-b64f-027c0fc4b257\") " pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.388421 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e75301-90f4-4498-b64f-027c0fc4b257-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f7e75301-90f4-4498-b64f-027c0fc4b257\") " pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.389753 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e75301-90f4-4498-b64f-027c0fc4b257-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"f7e75301-90f4-4498-b64f-027c0fc4b257\") " pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.402790 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnsnn\" (UniqueName: \"kubernetes.io/projected/f7e75301-90f4-4498-b64f-027c0fc4b257-kube-api-access-dnsnn\") pod \"nova-cell0-conductor-0\" (UID: \"f7e75301-90f4-4498-b64f-027c0fc4b257\") " pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.450786 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:36 crc kubenswrapper[4834]: I1126 12:27:36.875427 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.067750 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f7e75301-90f4-4498-b64f-027c0fc4b257","Type":"ContainerStarted","Data":"d84504651f29a06c3300efe5f9ea222be403954f9d499dd4ab1ffe39b5b09f51"} Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.067813 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f7e75301-90f4-4498-b64f-027c0fc4b257","Type":"ContainerStarted","Data":"55707d0f01737f68385d1d2c9fe950b7b4542108554f983c6803c915a0816a49"} Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.067911 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.077529 4834 generic.go:334] "Generic (PLEG): container finished" podID="1727213d-f498-4d31-8389-b8d2e93421fe" containerID="aa7615db5ea456c3fa2ec98d904c6f4ad9a46f818342a9757a67de8749e68823" exitCode=0 Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.077567 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="1727213d-f498-4d31-8389-b8d2e93421fe" containerID="64ba6dd2878be898c8cd7e92e66f21de05e428d96ae0232e00cd9aa8c843255f" exitCode=2 Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.077578 4834 generic.go:334] "Generic (PLEG): container finished" podID="1727213d-f498-4d31-8389-b8d2e93421fe" containerID="efb98005413717a8b4bb571082fe9c98f02090e0d20bd4b8d144b1e6bf30cc6c" exitCode=0 Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.077591 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerDied","Data":"aa7615db5ea456c3fa2ec98d904c6f4ad9a46f818342a9757a67de8749e68823"} Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.077679 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerDied","Data":"64ba6dd2878be898c8cd7e92e66f21de05e428d96ae0232e00cd9aa8c843255f"} Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.077694 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerDied","Data":"efb98005413717a8b4bb571082fe9c98f02090e0d20bd4b8d144b1e6bf30cc6c"} Nov 26 12:27:37 crc kubenswrapper[4834]: I1126 12:27:37.095395 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.095385909 podStartE2EDuration="1.095385909s" podCreationTimestamp="2025-11-26 12:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:27:37.090139113 +0000 UTC m=+954.997352465" watchObservedRunningTime="2025-11-26 12:27:37.095385909 +0000 UTC m=+955.002599262" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.112215 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="1727213d-f498-4d31-8389-b8d2e93421fe" containerID="b77a65a6ef53ebc665fe2786518587a8e4568e5766b23a6d9c4fca39177a95a7" exitCode=0 Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.112346 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerDied","Data":"b77a65a6ef53ebc665fe2786518587a8e4568e5766b23a6d9c4fca39177a95a7"} Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.166632 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.166677 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.199963 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.371039 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-combined-ca-bundle\") pod \"1727213d-f498-4d31-8389-b8d2e93421fe\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.371087 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjzbl\" (UniqueName: \"kubernetes.io/projected/1727213d-f498-4d31-8389-b8d2e93421fe-kube-api-access-rjzbl\") pod \"1727213d-f498-4d31-8389-b8d2e93421fe\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.371121 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-log-httpd\") pod \"1727213d-f498-4d31-8389-b8d2e93421fe\" 
(UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.371146 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-config-data\") pod \"1727213d-f498-4d31-8389-b8d2e93421fe\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.371192 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-sg-core-conf-yaml\") pod \"1727213d-f498-4d31-8389-b8d2e93421fe\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.371260 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-run-httpd\") pod \"1727213d-f498-4d31-8389-b8d2e93421fe\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.371288 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-scripts\") pod \"1727213d-f498-4d31-8389-b8d2e93421fe\" (UID: \"1727213d-f498-4d31-8389-b8d2e93421fe\") " Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.372113 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1727213d-f498-4d31-8389-b8d2e93421fe" (UID: "1727213d-f498-4d31-8389-b8d2e93421fe"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.372263 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1727213d-f498-4d31-8389-b8d2e93421fe" (UID: "1727213d-f498-4d31-8389-b8d2e93421fe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.376944 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1727213d-f498-4d31-8389-b8d2e93421fe-kube-api-access-rjzbl" (OuterVolumeSpecName: "kube-api-access-rjzbl") pod "1727213d-f498-4d31-8389-b8d2e93421fe" (UID: "1727213d-f498-4d31-8389-b8d2e93421fe"). InnerVolumeSpecName "kube-api-access-rjzbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.377019 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-scripts" (OuterVolumeSpecName: "scripts") pod "1727213d-f498-4d31-8389-b8d2e93421fe" (UID: "1727213d-f498-4d31-8389-b8d2e93421fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.394956 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1727213d-f498-4d31-8389-b8d2e93421fe" (UID: "1727213d-f498-4d31-8389-b8d2e93421fe"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.425384 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1727213d-f498-4d31-8389-b8d2e93421fe" (UID: "1727213d-f498-4d31-8389-b8d2e93421fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.446255 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-config-data" (OuterVolumeSpecName: "config-data") pod "1727213d-f498-4d31-8389-b8d2e93421fe" (UID: "1727213d-f498-4d31-8389-b8d2e93421fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.474276 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.474326 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.474341 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjzbl\" (UniqueName: \"kubernetes.io/projected/1727213d-f498-4d31-8389-b8d2e93421fe-kube-api-access-rjzbl\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.474352 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 
12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.474360 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.474368 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1727213d-f498-4d31-8389-b8d2e93421fe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:40 crc kubenswrapper[4834]: I1126 12:27:40.474377 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1727213d-f498-4d31-8389-b8d2e93421fe-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.127259 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1727213d-f498-4d31-8389-b8d2e93421fe","Type":"ContainerDied","Data":"2aa3784015a39dca76413402f56b06fd4ba625855b4403fa35631e8cc89dd2d8"} Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.127301 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.127353 4834 scope.go:117] "RemoveContainer" containerID="aa7615db5ea456c3fa2ec98d904c6f4ad9a46f818342a9757a67de8749e68823" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.151885 4834 scope.go:117] "RemoveContainer" containerID="64ba6dd2878be898c8cd7e92e66f21de05e428d96ae0232e00cd9aa8c843255f" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.160694 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.176388 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.186041 4834 scope.go:117] "RemoveContainer" containerID="efb98005413717a8b4bb571082fe9c98f02090e0d20bd4b8d144b1e6bf30cc6c" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.189862 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:41 crc kubenswrapper[4834]: E1126 12:27:41.190259 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="sg-core" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.190276 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="sg-core" Nov 26 12:27:41 crc kubenswrapper[4834]: E1126 12:27:41.190290 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="ceilometer-central-agent" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.190297 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="ceilometer-central-agent" Nov 26 12:27:41 crc kubenswrapper[4834]: E1126 12:27:41.190334 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="ceilometer-notification-agent" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.190342 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="ceilometer-notification-agent" Nov 26 12:27:41 crc kubenswrapper[4834]: E1126 12:27:41.190371 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="proxy-httpd" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.190377 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="proxy-httpd" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.190548 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="proxy-httpd" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.190570 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="sg-core" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.190588 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="ceilometer-central-agent" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.190598 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" containerName="ceilometer-notification-agent" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.192524 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.194811 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.195050 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.203361 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k9tz9" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" containerName="registry-server" probeResult="failure" output=< Nov 26 12:27:41 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Nov 26 12:27:41 crc kubenswrapper[4834]: > Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.205504 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.230852 4834 scope.go:117] "RemoveContainer" containerID="b77a65a6ef53ebc665fe2786518587a8e4568e5766b23a6d9c4fca39177a95a7" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.288632 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.288714 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-scripts\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.288775 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-run-httpd\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.288828 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.289103 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5sdx\" (UniqueName: \"kubernetes.io/projected/34aac5bd-c148-4b6b-a39f-cdbed6aca483-kube-api-access-n5sdx\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.289369 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-log-httpd\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.289407 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-config-data\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.390345 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5sdx\" (UniqueName: 
\"kubernetes.io/projected/34aac5bd-c148-4b6b-a39f-cdbed6aca483-kube-api-access-n5sdx\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.390407 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-config-data\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.390427 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-log-httpd\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.390468 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.390515 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-scripts\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.390550 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-run-httpd\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.390579 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.391013 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-log-httpd\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.391086 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-run-httpd\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.396937 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-scripts\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.396968 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.397842 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-config-data\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" 
Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.402687 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.405420 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5sdx\" (UniqueName: \"kubernetes.io/projected/34aac5bd-c148-4b6b-a39f-cdbed6aca483-kube-api-access-n5sdx\") pod \"ceilometer-0\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.509368 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:27:41 crc kubenswrapper[4834]: I1126 12:27:41.906372 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:27:42 crc kubenswrapper[4834]: I1126 12:27:42.138194 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerStarted","Data":"80786b9bbd0400d5431097bd571294574dee6f69590796d32713ee8fbc880690"} Nov 26 12:27:42 crc kubenswrapper[4834]: I1126 12:27:42.427933 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1727213d-f498-4d31-8389-b8d2e93421fe" path="/var/lib/kubelet/pods/1727213d-f498-4d31-8389-b8d2e93421fe/volumes" Nov 26 12:27:43 crc kubenswrapper[4834]: I1126 12:27:43.149602 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerStarted","Data":"ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b"} Nov 26 12:27:44 crc kubenswrapper[4834]: I1126 12:27:44.158150 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerStarted","Data":"7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d"} Nov 26 12:27:45 crc kubenswrapper[4834]: I1126 12:27:45.169039 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerStarted","Data":"7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0"} Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.181289 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerStarted","Data":"246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001"} Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.182180 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.208087 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4827491830000001 podStartE2EDuration="5.208074722s" podCreationTimestamp="2025-11-26 12:27:41 +0000 UTC" firstStartedPulling="2025-11-26 12:27:41.915018874 +0000 UTC m=+959.822232226" lastFinishedPulling="2025-11-26 12:27:45.640344413 +0000 UTC m=+963.547557765" observedRunningTime="2025-11-26 12:27:46.204953936 +0000 UTC m=+964.112167288" watchObservedRunningTime="2025-11-26 12:27:46.208074722 +0000 UTC m=+964.115288073" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.482540 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.899787 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-b558t"] Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.900790 4834 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.908615 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.908734 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.910288 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.910360 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-config-data\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.910386 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-scripts\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.910431 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmqd\" (UniqueName: \"kubernetes.io/projected/f54e62ed-7746-4227-957c-febe65052a53-kube-api-access-pvmqd\") pod \"nova-cell0-cell-mapping-b558t\" (UID: 
\"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:46 crc kubenswrapper[4834]: I1126 12:27:46.911397 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-b558t"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.011663 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmqd\" (UniqueName: \"kubernetes.io/projected/f54e62ed-7746-4227-957c-febe65052a53-kube-api-access-pvmqd\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.011773 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.011827 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-config-data\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.011857 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-scripts\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.023249 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.027781 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-config-data\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.031343 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-scripts\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.033217 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.034446 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.034733 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmqd\" (UniqueName: \"kubernetes.io/projected/f54e62ed-7746-4227-957c-febe65052a53-kube-api-access-pvmqd\") pod \"nova-cell0-cell-mapping-b558t\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.038702 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.075387 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.113325 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.115636 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-config-data\") pod \"nova-scheduler-0\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.115687 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcjx\" (UniqueName: \"kubernetes.io/projected/9d680001-4cac-43d0-87d1-d42f100ce726-kube-api-access-qmcjx\") pod \"nova-scheduler-0\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.115721 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.115740 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.125727 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.142458 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.152592 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.153866 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.156910 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.172715 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.217178 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7wt2\" (UniqueName: \"kubernetes.io/projected/fde4d925-7270-403a-b047-9533a8a61c3c-kube-api-access-z7wt2\") pod \"nova-cell1-novncproxy-0\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.217229 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9896132d-9ec8-4055-8f48-af08374401fd-logs\") pod \"nova-metadata-0\" (UID: 
\"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.217406 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-config-data\") pod \"nova-scheduler-0\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.217439 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcjx\" (UniqueName: \"kubernetes.io/projected/9d680001-4cac-43d0-87d1-d42f100ce726-kube-api-access-qmcjx\") pod \"nova-scheduler-0\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.217478 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.217516 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72kss\" (UniqueName: \"kubernetes.io/projected/9896132d-9ec8-4055-8f48-af08374401fd-kube-api-access-72kss\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.217711 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc 
kubenswrapper[4834]: I1126 12:27:47.217792 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-config-data\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.217833 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.217917 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.222342 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.224443 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f7bbc55bc-zj66h"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.226872 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.228251 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.236446 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-config-data\") pod \"nova-scheduler-0\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.239251 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcjx\" (UniqueName: \"kubernetes.io/projected/9d680001-4cac-43d0-87d1-d42f100ce726-kube-api-access-qmcjx\") pod \"nova-scheduler-0\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.241256 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f7bbc55bc-zj66h"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.283035 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.285018 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.289211 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.295692 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.334509 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72kss\" (UniqueName: \"kubernetes.io/projected/9896132d-9ec8-4055-8f48-af08374401fd-kube-api-access-72kss\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.334630 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.334679 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-config-data\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.334708 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.334752 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.334773 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7wt2\" (UniqueName: \"kubernetes.io/projected/fde4d925-7270-403a-b047-9533a8a61c3c-kube-api-access-z7wt2\") pod \"nova-cell1-novncproxy-0\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.334791 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9896132d-9ec8-4055-8f48-af08374401fd-logs\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.335275 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9896132d-9ec8-4055-8f48-af08374401fd-logs\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.342285 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.343097 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-config-data\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " 
pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.340065 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.343803 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.352717 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7wt2\" (UniqueName: \"kubernetes.io/projected/fde4d925-7270-403a-b047-9533a8a61c3c-kube-api-access-z7wt2\") pod \"nova-cell1-novncproxy-0\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.356438 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72kss\" (UniqueName: \"kubernetes.io/projected/9896132d-9ec8-4055-8f48-af08374401fd-kube-api-access-72kss\") pod \"nova-metadata-0\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.427431 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.440063 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-sb\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.440100 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-config-data\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.440157 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djdcr\" (UniqueName: \"kubernetes.io/projected/5599c395-4375-4e68-bcc3-448e61f2ee1d-kube-api-access-djdcr\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.440232 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-config\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.440335 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-nb\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " 
pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.440362 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.440396 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fba220-facf-4b30-8525-a923cc54bda7-logs\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.440421 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8bh7\" (UniqueName: \"kubernetes.io/projected/26fba220-facf-4b30-8525-a923cc54bda7-kube-api-access-w8bh7\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.440495 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-dns-svc\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.454680 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.484790 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.541962 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-config\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.542286 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-nb\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.542333 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.542380 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fba220-facf-4b30-8525-a923cc54bda7-logs\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.542409 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8bh7\" (UniqueName: \"kubernetes.io/projected/26fba220-facf-4b30-8525-a923cc54bda7-kube-api-access-w8bh7\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.542502 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-dns-svc\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.542575 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-sb\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.542592 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-config-data\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.542621 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djdcr\" (UniqueName: \"kubernetes.io/projected/5599c395-4375-4e68-bcc3-448e61f2ee1d-kube-api-access-djdcr\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.543803 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-config\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.544447 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-nb\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.547219 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fba220-facf-4b30-8525-a923cc54bda7-logs\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.547832 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-sb\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.548725 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-dns-svc\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.551505 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.559038 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-config-data\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc 
kubenswrapper[4834]: I1126 12:27:47.572362 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djdcr\" (UniqueName: \"kubernetes.io/projected/5599c395-4375-4e68-bcc3-448e61f2ee1d-kube-api-access-djdcr\") pod \"dnsmasq-dns-f7bbc55bc-zj66h\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.576127 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8bh7\" (UniqueName: \"kubernetes.io/projected/26fba220-facf-4b30-8525-a923cc54bda7-kube-api-access-w8bh7\") pod \"nova-api-0\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.670404 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.700361 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:27:47 crc kubenswrapper[4834]: I1126 12:27:47.882707 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-b558t"] Nov 26 12:27:47 crc kubenswrapper[4834]: W1126 12:27:47.941332 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf54e62ed_7746_4227_957c_febe65052a53.slice/crio-56f116b952511886b9dfebc17bce73c5374dac9900b6cf4d35e2d760b2f49ae1 WatchSource:0}: Error finding container 56f116b952511886b9dfebc17bce73c5374dac9900b6cf4d35e2d760b2f49ae1: Status 404 returned error can't find the container with id 56f116b952511886b9dfebc17bce73c5374dac9900b6cf4d35e2d760b2f49ae1 Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.043124 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.219210 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b558t" event={"ID":"f54e62ed-7746-4227-957c-febe65052a53","Type":"ContainerStarted","Data":"56f116b952511886b9dfebc17bce73c5374dac9900b6cf4d35e2d760b2f49ae1"} Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.221567 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9d680001-4cac-43d0-87d1-d42f100ce726","Type":"ContainerStarted","Data":"d314496789a438b9a2f30b0d0c702cf4c490ac24f97d01816197212ddb452c25"} Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.244428 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.282836 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.444369 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-f7bbc55bc-zj66h"] Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.465364 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gnszd"] Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.466781 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.469108 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.469595 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.491707 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gnszd"] Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.501975 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.502068 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdsp2\" (UniqueName: \"kubernetes.io/projected/163ea76c-b946-4d71-ab57-fc60b515cced-kube-api-access-qdsp2\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.502197 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-scripts\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.502242 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-config-data\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.540198 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:27:48 crc kubenswrapper[4834]: W1126 12:27:48.542208 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26fba220_facf_4b30_8525_a923cc54bda7.slice/crio-c0699ce7d7dcbd54bc6fc3c67ad251cd539daf831bdc0a951f3dee8e0367935a WatchSource:0}: Error finding container c0699ce7d7dcbd54bc6fc3c67ad251cd539daf831bdc0a951f3dee8e0367935a: Status 404 returned error can't find the container with id c0699ce7d7dcbd54bc6fc3c67ad251cd539daf831bdc0a951f3dee8e0367935a Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.605634 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.605772 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdsp2\" (UniqueName: \"kubernetes.io/projected/163ea76c-b946-4d71-ab57-fc60b515cced-kube-api-access-qdsp2\") pod 
\"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.606161 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-scripts\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.606274 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-config-data\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.610256 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.610618 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-scripts\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.610784 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-config-data\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: 
\"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.624454 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdsp2\" (UniqueName: \"kubernetes.io/projected/163ea76c-b946-4d71-ab57-fc60b515cced-kube-api-access-qdsp2\") pod \"nova-cell1-conductor-db-sync-gnszd\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:48 crc kubenswrapper[4834]: I1126 12:27:48.797161 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:49 crc kubenswrapper[4834]: I1126 12:27:49.244986 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9896132d-9ec8-4055-8f48-af08374401fd","Type":"ContainerStarted","Data":"513673236f80cf6a5e61dbad0a2d82f8447e0bcd0de68a9ab3d4c4c104c3b56e"} Nov 26 12:27:49 crc kubenswrapper[4834]: I1126 12:27:49.247144 4834 generic.go:334] "Generic (PLEG): container finished" podID="5599c395-4375-4e68-bcc3-448e61f2ee1d" containerID="61b452d4a80ec245831983ced0fcc0a0c4e85024681ad711651cb863513b1904" exitCode=0 Nov 26 12:27:49 crc kubenswrapper[4834]: I1126 12:27:49.247192 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" event={"ID":"5599c395-4375-4e68-bcc3-448e61f2ee1d","Type":"ContainerDied","Data":"61b452d4a80ec245831983ced0fcc0a0c4e85024681ad711651cb863513b1904"} Nov 26 12:27:49 crc kubenswrapper[4834]: I1126 12:27:49.247211 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" event={"ID":"5599c395-4375-4e68-bcc3-448e61f2ee1d","Type":"ContainerStarted","Data":"b3c1667d91bbf90535a131b80b044d9dc9c29427726af4ca01aa1b6664c0ac6b"} Nov 26 12:27:49 crc kubenswrapper[4834]: I1126 12:27:49.248903 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fde4d925-7270-403a-b047-9533a8a61c3c","Type":"ContainerStarted","Data":"7caba0e77c5377a82c37188ae0d11b8237a34c4f481aa5fbddeb6580ccb507c6"} Nov 26 12:27:49 crc kubenswrapper[4834]: I1126 12:27:49.250747 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b558t" event={"ID":"f54e62ed-7746-4227-957c-febe65052a53","Type":"ContainerStarted","Data":"b71b9afc11dd45f6015d69187b8a583d7a421bbd5b2d5db75e66cd6ac4b57ee5"} Nov 26 12:27:49 crc kubenswrapper[4834]: I1126 12:27:49.254345 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26fba220-facf-4b30-8525-a923cc54bda7","Type":"ContainerStarted","Data":"c0699ce7d7dcbd54bc6fc3c67ad251cd539daf831bdc0a951f3dee8e0367935a"} Nov 26 12:27:49 crc kubenswrapper[4834]: I1126 12:27:49.311262 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-b558t" podStartSLOduration=3.311241344 podStartE2EDuration="3.311241344s" podCreationTimestamp="2025-11-26 12:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:27:49.292078411 +0000 UTC m=+967.199291763" watchObservedRunningTime="2025-11-26 12:27:49.311241344 +0000 UTC m=+967.218454696" Nov 26 12:27:49 crc kubenswrapper[4834]: W1126 12:27:49.335773 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod163ea76c_b946_4d71_ab57_fc60b515cced.slice/crio-98657720c7f3b5a394b8a66424a4aae425a1c5b01a43d0137d12d3b02603bceb WatchSource:0}: Error finding container 98657720c7f3b5a394b8a66424a4aae425a1c5b01a43d0137d12d3b02603bceb: Status 404 returned error can't find the container with id 98657720c7f3b5a394b8a66424a4aae425a1c5b01a43d0137d12d3b02603bceb Nov 26 12:27:49 crc kubenswrapper[4834]: I1126 12:27:49.358101 4834 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gnszd"] Nov 26 12:27:50 crc kubenswrapper[4834]: I1126 12:27:50.220404 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:50 crc kubenswrapper[4834]: I1126 12:27:50.266567 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:50 crc kubenswrapper[4834]: I1126 12:27:50.274269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gnszd" event={"ID":"163ea76c-b946-4d71-ab57-fc60b515cced","Type":"ContainerStarted","Data":"98657720c7f3b5a394b8a66424a4aae425a1c5b01a43d0137d12d3b02603bceb"} Nov 26 12:27:50 crc kubenswrapper[4834]: I1126 12:27:50.276229 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" event={"ID":"5599c395-4375-4e68-bcc3-448e61f2ee1d","Type":"ContainerStarted","Data":"12f67a3e666e156e51fd52e8017a07107066c703e0a539b1fbca75516d66a312"} Nov 26 12:27:50 crc kubenswrapper[4834]: I1126 12:27:50.276586 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:50 crc kubenswrapper[4834]: I1126 12:27:50.310597 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" podStartSLOduration=3.310580931 podStartE2EDuration="3.310580931s" podCreationTimestamp="2025-11-26 12:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:27:50.295920952 +0000 UTC m=+968.203134304" watchObservedRunningTime="2025-11-26 12:27:50.310580931 +0000 UTC m=+968.217794284" Nov 26 12:27:50 crc kubenswrapper[4834]: I1126 12:27:50.454344 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-k9tz9"] Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.045357 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.055723 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.287264 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9d680001-4cac-43d0-87d1-d42f100ce726","Type":"ContainerStarted","Data":"37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c"} Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.289980 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fde4d925-7270-403a-b047-9533a8a61c3c","Type":"ContainerStarted","Data":"f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc"} Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.290115 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fde4d925-7270-403a-b047-9533a8a61c3c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc" gracePeriod=30 Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.293238 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gnszd" event={"ID":"163ea76c-b946-4d71-ab57-fc60b515cced","Type":"ContainerStarted","Data":"237ac414a7e30fbabaa31c28f26bdb44891e631402bca7a56cd5e08e0521f6fe"} Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.295884 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26fba220-facf-4b30-8525-a923cc54bda7","Type":"ContainerStarted","Data":"2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb"} Nov 26 12:27:51 crc 
kubenswrapper[4834]: I1126 12:27:51.303786 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9896132d-9ec8-4055-8f48-af08374401fd","Type":"ContainerStarted","Data":"136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e"} Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.304220 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k9tz9" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" containerName="registry-server" containerID="cri-o://91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114" gracePeriod=2 Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.319863 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.5406627240000001 podStartE2EDuration="4.319841172s" podCreationTimestamp="2025-11-26 12:27:47 +0000 UTC" firstStartedPulling="2025-11-26 12:27:48.092446652 +0000 UTC m=+965.999659993" lastFinishedPulling="2025-11-26 12:27:50.871625089 +0000 UTC m=+968.778838441" observedRunningTime="2025-11-26 12:27:51.30313383 +0000 UTC m=+969.210347182" watchObservedRunningTime="2025-11-26 12:27:51.319841172 +0000 UTC m=+969.227054524" Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.334118 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-gnszd" podStartSLOduration=3.334094886 podStartE2EDuration="3.334094886s" podCreationTimestamp="2025-11-26 12:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:27:51.33227882 +0000 UTC m=+969.239492172" watchObservedRunningTime="2025-11-26 12:27:51.334094886 +0000 UTC m=+969.241308237" Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.336910 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.730674746 podStartE2EDuration="4.336887924s" podCreationTimestamp="2025-11-26 12:27:47 +0000 UTC" firstStartedPulling="2025-11-26 12:27:48.271321458 +0000 UTC m=+966.178534809" lastFinishedPulling="2025-11-26 12:27:50.877534634 +0000 UTC m=+968.784747987" observedRunningTime="2025-11-26 12:27:51.321858355 +0000 UTC m=+969.229071708" watchObservedRunningTime="2025-11-26 12:27:51.336887924 +0000 UTC m=+969.244101276" Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.687992 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.783048 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph7rn\" (UniqueName: \"kubernetes.io/projected/52608f20-2e28-494a-82b8-4a3ffb789799-kube-api-access-ph7rn\") pod \"52608f20-2e28-494a-82b8-4a3ffb789799\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.783132 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-catalog-content\") pod \"52608f20-2e28-494a-82b8-4a3ffb789799\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.783255 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-utilities\") pod \"52608f20-2e28-494a-82b8-4a3ffb789799\" (UID: \"52608f20-2e28-494a-82b8-4a3ffb789799\") " Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.784228 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-utilities" (OuterVolumeSpecName: "utilities") pod 
"52608f20-2e28-494a-82b8-4a3ffb789799" (UID: "52608f20-2e28-494a-82b8-4a3ffb789799"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.787650 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52608f20-2e28-494a-82b8-4a3ffb789799-kube-api-access-ph7rn" (OuterVolumeSpecName: "kube-api-access-ph7rn") pod "52608f20-2e28-494a-82b8-4a3ffb789799" (UID: "52608f20-2e28-494a-82b8-4a3ffb789799"). InnerVolumeSpecName "kube-api-access-ph7rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.846858 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52608f20-2e28-494a-82b8-4a3ffb789799" (UID: "52608f20-2e28-494a-82b8-4a3ffb789799"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.886364 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.886401 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52608f20-2e28-494a-82b8-4a3ffb789799-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:51 crc kubenswrapper[4834]: I1126 12:27:51.886415 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph7rn\" (UniqueName: \"kubernetes.io/projected/52608f20-2e28-494a-82b8-4a3ffb789799-kube-api-access-ph7rn\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.323036 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26fba220-facf-4b30-8525-a923cc54bda7","Type":"ContainerStarted","Data":"75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b"} Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.325409 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9896132d-9ec8-4055-8f48-af08374401fd","Type":"ContainerStarted","Data":"a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d"} Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.325526 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9896132d-9ec8-4055-8f48-af08374401fd" containerName="nova-metadata-log" containerID="cri-o://136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e" gracePeriod=30 Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.325548 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="9896132d-9ec8-4055-8f48-af08374401fd" containerName="nova-metadata-metadata" containerID="cri-o://a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d" gracePeriod=30 Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.330753 4834 generic.go:334] "Generic (PLEG): container finished" podID="52608f20-2e28-494a-82b8-4a3ffb789799" containerID="91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114" exitCode=0 Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.331451 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9tz9" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.332974 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9tz9" event={"ID":"52608f20-2e28-494a-82b8-4a3ffb789799","Type":"ContainerDied","Data":"91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114"} Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.333021 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9tz9" event={"ID":"52608f20-2e28-494a-82b8-4a3ffb789799","Type":"ContainerDied","Data":"ad97bbe9ad9808e2891b22e7c572b0dc1f0ad4f13daa34c79b5f8763e08629ca"} Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.333046 4834 scope.go:117] "RemoveContainer" containerID="91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.363464 4834 scope.go:117] "RemoveContainer" containerID="85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.377233 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.047904467 podStartE2EDuration="5.377196462s" podCreationTimestamp="2025-11-26 12:27:47 +0000 UTC" firstStartedPulling="2025-11-26 12:27:48.54390021 +0000 UTC m=+966.451113562" 
lastFinishedPulling="2025-11-26 12:27:50.873192205 +0000 UTC m=+968.780405557" observedRunningTime="2025-11-26 12:27:52.354040237 +0000 UTC m=+970.261253590" watchObservedRunningTime="2025-11-26 12:27:52.377196462 +0000 UTC m=+970.284409815" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.382837 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.808043434 podStartE2EDuration="5.382819558s" podCreationTimestamp="2025-11-26 12:27:47 +0000 UTC" firstStartedPulling="2025-11-26 12:27:48.302766246 +0000 UTC m=+966.209979598" lastFinishedPulling="2025-11-26 12:27:50.87754237 +0000 UTC m=+968.784755722" observedRunningTime="2025-11-26 12:27:52.372919225 +0000 UTC m=+970.280132578" watchObservedRunningTime="2025-11-26 12:27:52.382819558 +0000 UTC m=+970.290032910" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.416065 4834 scope.go:117] "RemoveContainer" containerID="95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.447610 4834 scope.go:117] "RemoveContainer" containerID="91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114" Nov 26 12:27:52 crc kubenswrapper[4834]: E1126 12:27:52.460332 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114\": container with ID starting with 91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114 not found: ID does not exist" containerID="91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.460366 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114"} err="failed to get container status 
\"91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114\": rpc error: code = NotFound desc = could not find container \"91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114\": container with ID starting with 91c44fcda120c03f2b4bb961a6b524e75582aa2970d7fbe9a9c648f6e8266114 not found: ID does not exist" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.460390 4834 scope.go:117] "RemoveContainer" containerID="85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f" Nov 26 12:27:52 crc kubenswrapper[4834]: E1126 12:27:52.464417 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f\": container with ID starting with 85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f not found: ID does not exist" containerID="85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.464670 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f"} err="failed to get container status \"85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f\": rpc error: code = NotFound desc = could not find container \"85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f\": container with ID starting with 85567779b15ec1016b677982f043ceea9b7c3d3e0f74b3dca07d24cae3d7966f not found: ID does not exist" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.464683 4834 scope.go:117] "RemoveContainer" containerID="95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746" Nov 26 12:27:52 crc kubenswrapper[4834]: E1126 12:27:52.464970 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746\": container with ID starting with 95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746 not found: ID does not exist" containerID="95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.464988 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746"} err="failed to get container status \"95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746\": rpc error: code = NotFound desc = could not find container \"95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746\": container with ID starting with 95ecf1a966e742be4d9938cbc772d086809e29ec899388b29b466de301ba3746 not found: ID does not exist" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.475140 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.475182 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.475192 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.475204 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k9tz9"] Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.475223 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k9tz9"] Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.485398 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.772863 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.804824 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9896132d-9ec8-4055-8f48-af08374401fd-logs\") pod \"9896132d-9ec8-4055-8f48-af08374401fd\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.805047 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-config-data\") pod \"9896132d-9ec8-4055-8f48-af08374401fd\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.805137 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72kss\" (UniqueName: \"kubernetes.io/projected/9896132d-9ec8-4055-8f48-af08374401fd-kube-api-access-72kss\") pod \"9896132d-9ec8-4055-8f48-af08374401fd\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.805260 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-combined-ca-bundle\") pod \"9896132d-9ec8-4055-8f48-af08374401fd\" (UID: \"9896132d-9ec8-4055-8f48-af08374401fd\") " Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.805556 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9896132d-9ec8-4055-8f48-af08374401fd-logs" (OuterVolumeSpecName: "logs") pod "9896132d-9ec8-4055-8f48-af08374401fd" (UID: "9896132d-9ec8-4055-8f48-af08374401fd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.806147 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9896132d-9ec8-4055-8f48-af08374401fd-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.818078 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9896132d-9ec8-4055-8f48-af08374401fd-kube-api-access-72kss" (OuterVolumeSpecName: "kube-api-access-72kss") pod "9896132d-9ec8-4055-8f48-af08374401fd" (UID: "9896132d-9ec8-4055-8f48-af08374401fd"). InnerVolumeSpecName "kube-api-access-72kss". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.827167 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9896132d-9ec8-4055-8f48-af08374401fd" (UID: "9896132d-9ec8-4055-8f48-af08374401fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.837761 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-config-data" (OuterVolumeSpecName: "config-data") pod "9896132d-9ec8-4055-8f48-af08374401fd" (UID: "9896132d-9ec8-4055-8f48-af08374401fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.908275 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.908319 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896132d-9ec8-4055-8f48-af08374401fd-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:52 crc kubenswrapper[4834]: I1126 12:27:52.908330 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72kss\" (UniqueName: \"kubernetes.io/projected/9896132d-9ec8-4055-8f48-af08374401fd-kube-api-access-72kss\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.340288 4834 generic.go:334] "Generic (PLEG): container finished" podID="9896132d-9ec8-4055-8f48-af08374401fd" containerID="a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d" exitCode=0 Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.340545 4834 generic.go:334] "Generic (PLEG): container finished" podID="9896132d-9ec8-4055-8f48-af08374401fd" containerID="136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e" exitCode=143 Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.340347 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9896132d-9ec8-4055-8f48-af08374401fd","Type":"ContainerDied","Data":"a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d"} Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.340360 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.340590 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9896132d-9ec8-4055-8f48-af08374401fd","Type":"ContainerDied","Data":"136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e"} Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.340609 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9896132d-9ec8-4055-8f48-af08374401fd","Type":"ContainerDied","Data":"513673236f80cf6a5e61dbad0a2d82f8447e0bcd0de68a9ab3d4c4c104c3b56e"} Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.340619 4834 scope.go:117] "RemoveContainer" containerID="a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.364199 4834 scope.go:117] "RemoveContainer" containerID="136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.369964 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.381818 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.391588 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:53 crc kubenswrapper[4834]: E1126 12:27:53.391973 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9896132d-9ec8-4055-8f48-af08374401fd" containerName="nova-metadata-log" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.391990 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9896132d-9ec8-4055-8f48-af08374401fd" containerName="nova-metadata-log" Nov 26 12:27:53 crc kubenswrapper[4834]: E1126 12:27:53.392008 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9896132d-9ec8-4055-8f48-af08374401fd" containerName="nova-metadata-metadata" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.392015 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9896132d-9ec8-4055-8f48-af08374401fd" containerName="nova-metadata-metadata" Nov 26 12:27:53 crc kubenswrapper[4834]: E1126 12:27:53.392032 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" containerName="registry-server" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.392038 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" containerName="registry-server" Nov 26 12:27:53 crc kubenswrapper[4834]: E1126 12:27:53.392063 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" containerName="extract-content" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.392068 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" containerName="extract-content" Nov 26 12:27:53 crc kubenswrapper[4834]: E1126 12:27:53.392079 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" containerName="extract-utilities" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.392084 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" containerName="extract-utilities" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.392244 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9896132d-9ec8-4055-8f48-af08374401fd" containerName="nova-metadata-log" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.392263 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" containerName="registry-server" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.392272 4834 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9896132d-9ec8-4055-8f48-af08374401fd" containerName="nova-metadata-metadata" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.393206 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.396694 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.397046 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.400492 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.404556 4834 scope.go:117] "RemoveContainer" containerID="a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d" Nov 26 12:27:53 crc kubenswrapper[4834]: E1126 12:27:53.404880 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d\": container with ID starting with a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d not found: ID does not exist" containerID="a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.404912 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d"} err="failed to get container status \"a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d\": rpc error: code = NotFound desc = could not find container \"a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d\": container with ID starting with a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d not found: ID does not 
exist" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.404932 4834 scope.go:117] "RemoveContainer" containerID="136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e" Nov 26 12:27:53 crc kubenswrapper[4834]: E1126 12:27:53.405149 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e\": container with ID starting with 136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e not found: ID does not exist" containerID="136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.405172 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e"} err="failed to get container status \"136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e\": rpc error: code = NotFound desc = could not find container \"136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e\": container with ID starting with 136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e not found: ID does not exist" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.405184 4834 scope.go:117] "RemoveContainer" containerID="a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.405421 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d"} err="failed to get container status \"a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d\": rpc error: code = NotFound desc = could not find container \"a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d\": container with ID starting with a501a7c398acf2d6225e6b39351a437ba55da179e62739b8fe83ac985e758d0d not found: ID 
does not exist" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.405441 4834 scope.go:117] "RemoveContainer" containerID="136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.405627 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e"} err="failed to get container status \"136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e\": rpc error: code = NotFound desc = could not find container \"136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e\": container with ID starting with 136a7cd9c1c15fd733feed858ed0118bb03f41aca674ef3f0cf32c4c78e46e4e not found: ID does not exist" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.418582 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c697f0b-b130-4424-8309-d2874e94f4d8-logs\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.418656 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-config-data\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.418695 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.418715 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxbk\" (UniqueName: \"kubernetes.io/projected/5c697f0b-b130-4424-8309-d2874e94f4d8-kube-api-access-5cxbk\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.418782 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.519883 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c697f0b-b130-4424-8309-d2874e94f4d8-logs\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.519977 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-config-data\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.520010 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.520033 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxbk\" (UniqueName: 
\"kubernetes.io/projected/5c697f0b-b130-4424-8309-d2874e94f4d8-kube-api-access-5cxbk\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.520068 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.520274 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c697f0b-b130-4424-8309-d2874e94f4d8-logs\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.524928 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-config-data\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.524967 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.524997 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc 
kubenswrapper[4834]: I1126 12:27:53.535623 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxbk\" (UniqueName: \"kubernetes.io/projected/5c697f0b-b130-4424-8309-d2874e94f4d8-kube-api-access-5cxbk\") pod \"nova-metadata-0\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " pod="openstack/nova-metadata-0" Nov 26 12:27:53 crc kubenswrapper[4834]: I1126 12:27:53.719647 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:54 crc kubenswrapper[4834]: I1126 12:27:54.105176 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:54 crc kubenswrapper[4834]: I1126 12:27:54.349208 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c697f0b-b130-4424-8309-d2874e94f4d8","Type":"ContainerStarted","Data":"d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1"} Nov 26 12:27:54 crc kubenswrapper[4834]: I1126 12:27:54.349401 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c697f0b-b130-4424-8309-d2874e94f4d8","Type":"ContainerStarted","Data":"32971e92b5bec9e4f2b0341f7936516554fedca16d99ee1102d5469763faee5c"} Nov 26 12:27:54 crc kubenswrapper[4834]: I1126 12:27:54.426855 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52608f20-2e28-494a-82b8-4a3ffb789799" path="/var/lib/kubelet/pods/52608f20-2e28-494a-82b8-4a3ffb789799/volumes" Nov 26 12:27:54 crc kubenswrapper[4834]: I1126 12:27:54.428083 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9896132d-9ec8-4055-8f48-af08374401fd" path="/var/lib/kubelet/pods/9896132d-9ec8-4055-8f48-af08374401fd/volumes" Nov 26 12:27:55 crc kubenswrapper[4834]: I1126 12:27:55.359161 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"5c697f0b-b130-4424-8309-d2874e94f4d8","Type":"ContainerStarted","Data":"0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a"} Nov 26 12:27:55 crc kubenswrapper[4834]: I1126 12:27:55.365780 4834 generic.go:334] "Generic (PLEG): container finished" podID="163ea76c-b946-4d71-ab57-fc60b515cced" containerID="237ac414a7e30fbabaa31c28f26bdb44891e631402bca7a56cd5e08e0521f6fe" exitCode=0 Nov 26 12:27:55 crc kubenswrapper[4834]: I1126 12:27:55.365837 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gnszd" event={"ID":"163ea76c-b946-4d71-ab57-fc60b515cced","Type":"ContainerDied","Data":"237ac414a7e30fbabaa31c28f26bdb44891e631402bca7a56cd5e08e0521f6fe"} Nov 26 12:27:55 crc kubenswrapper[4834]: I1126 12:27:55.367922 4834 generic.go:334] "Generic (PLEG): container finished" podID="f54e62ed-7746-4227-957c-febe65052a53" containerID="b71b9afc11dd45f6015d69187b8a583d7a421bbd5b2d5db75e66cd6ac4b57ee5" exitCode=0 Nov 26 12:27:55 crc kubenswrapper[4834]: I1126 12:27:55.367947 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b558t" event={"ID":"f54e62ed-7746-4227-957c-febe65052a53","Type":"ContainerDied","Data":"b71b9afc11dd45f6015d69187b8a583d7a421bbd5b2d5db75e66cd6ac4b57ee5"} Nov 26 12:27:55 crc kubenswrapper[4834]: I1126 12:27:55.381439 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.381416312 podStartE2EDuration="2.381416312s" podCreationTimestamp="2025-11-26 12:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:27:55.379883821 +0000 UTC m=+973.287097173" watchObservedRunningTime="2025-11-26 12:27:55.381416312 +0000 UTC m=+973.288629663" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.624797 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.670187 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-config-data\") pod \"163ea76c-b946-4d71-ab57-fc60b515cced\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.670940 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdsp2\" (UniqueName: \"kubernetes.io/projected/163ea76c-b946-4d71-ab57-fc60b515cced-kube-api-access-qdsp2\") pod \"163ea76c-b946-4d71-ab57-fc60b515cced\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.671026 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-combined-ca-bundle\") pod \"163ea76c-b946-4d71-ab57-fc60b515cced\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.671287 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-scripts\") pod \"163ea76c-b946-4d71-ab57-fc60b515cced\" (UID: \"163ea76c-b946-4d71-ab57-fc60b515cced\") " Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.675273 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/163ea76c-b946-4d71-ab57-fc60b515cced-kube-api-access-qdsp2" (OuterVolumeSpecName: "kube-api-access-qdsp2") pod "163ea76c-b946-4d71-ab57-fc60b515cced" (UID: "163ea76c-b946-4d71-ab57-fc60b515cced"). InnerVolumeSpecName "kube-api-access-qdsp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.675339 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-scripts" (OuterVolumeSpecName: "scripts") pod "163ea76c-b946-4d71-ab57-fc60b515cced" (UID: "163ea76c-b946-4d71-ab57-fc60b515cced"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.690892 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "163ea76c-b946-4d71-ab57-fc60b515cced" (UID: "163ea76c-b946-4d71-ab57-fc60b515cced"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.692031 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-config-data" (OuterVolumeSpecName: "config-data") pod "163ea76c-b946-4d71-ab57-fc60b515cced" (UID: "163ea76c-b946-4d71-ab57-fc60b515cced"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.748184 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.772548 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-config-data\") pod \"f54e62ed-7746-4227-957c-febe65052a53\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.772606 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-scripts\") pod \"f54e62ed-7746-4227-957c-febe65052a53\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.772675 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvmqd\" (UniqueName: \"kubernetes.io/projected/f54e62ed-7746-4227-957c-febe65052a53-kube-api-access-pvmqd\") pod \"f54e62ed-7746-4227-957c-febe65052a53\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.772788 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-combined-ca-bundle\") pod \"f54e62ed-7746-4227-957c-febe65052a53\" (UID: \"f54e62ed-7746-4227-957c-febe65052a53\") " Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.773132 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdsp2\" (UniqueName: \"kubernetes.io/projected/163ea76c-b946-4d71-ab57-fc60b515cced-kube-api-access-qdsp2\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.773143 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.773154 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.773166 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/163ea76c-b946-4d71-ab57-fc60b515cced-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.776628 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-scripts" (OuterVolumeSpecName: "scripts") pod "f54e62ed-7746-4227-957c-febe65052a53" (UID: "f54e62ed-7746-4227-957c-febe65052a53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.777181 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54e62ed-7746-4227-957c-febe65052a53-kube-api-access-pvmqd" (OuterVolumeSpecName: "kube-api-access-pvmqd") pod "f54e62ed-7746-4227-957c-febe65052a53" (UID: "f54e62ed-7746-4227-957c-febe65052a53"). InnerVolumeSpecName "kube-api-access-pvmqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.793519 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-config-data" (OuterVolumeSpecName: "config-data") pod "f54e62ed-7746-4227-957c-febe65052a53" (UID: "f54e62ed-7746-4227-957c-febe65052a53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.794244 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f54e62ed-7746-4227-957c-febe65052a53" (UID: "f54e62ed-7746-4227-957c-febe65052a53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.873853 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvmqd\" (UniqueName: \"kubernetes.io/projected/f54e62ed-7746-4227-957c-febe65052a53-kube-api-access-pvmqd\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.873877 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.873886 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:56 crc kubenswrapper[4834]: I1126 12:27:56.873894 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54e62ed-7746-4227-957c-febe65052a53-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.383609 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gnszd" event={"ID":"163ea76c-b946-4d71-ab57-fc60b515cced","Type":"ContainerDied","Data":"98657720c7f3b5a394b8a66424a4aae425a1c5b01a43d0137d12d3b02603bceb"} Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.384579 4834 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="98657720c7f3b5a394b8a66424a4aae425a1c5b01a43d0137d12d3b02603bceb" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.383633 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gnszd" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.385443 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b558t" event={"ID":"f54e62ed-7746-4227-957c-febe65052a53","Type":"ContainerDied","Data":"56f116b952511886b9dfebc17bce73c5374dac9900b6cf4d35e2d760b2f49ae1"} Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.385483 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56f116b952511886b9dfebc17bce73c5374dac9900b6cf4d35e2d760b2f49ae1" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.385489 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b558t" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.428636 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.456821 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.474799 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 12:27:57 crc kubenswrapper[4834]: E1126 12:27:57.475126 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="163ea76c-b946-4d71-ab57-fc60b515cced" containerName="nova-cell1-conductor-db-sync" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.475142 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="163ea76c-b946-4d71-ab57-fc60b515cced" containerName="nova-cell1-conductor-db-sync" Nov 26 12:27:57 crc kubenswrapper[4834]: E1126 
12:27:57.475160 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54e62ed-7746-4227-957c-febe65052a53" containerName="nova-manage" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.475166 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54e62ed-7746-4227-957c-febe65052a53" containerName="nova-manage" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.475354 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="163ea76c-b946-4d71-ab57-fc60b515cced" containerName="nova-cell1-conductor-db-sync" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.475378 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54e62ed-7746-4227-957c-febe65052a53" containerName="nova-manage" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.475913 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.480791 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.488026 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.557502 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.557915 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="26fba220-facf-4b30-8525-a923cc54bda7" containerName="nova-api-log" containerID="cri-o://2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb" gracePeriod=30 Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.558012 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="26fba220-facf-4b30-8525-a923cc54bda7" containerName="nova-api-api" 
containerID="cri-o://75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b" gracePeriod=30 Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.563430 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.580116 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.580356 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerName="nova-metadata-log" containerID="cri-o://d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1" gracePeriod=30 Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.580776 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerName="nova-metadata-metadata" containerID="cri-o://0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a" gracePeriod=30 Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.588284 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1632f2-4571-4112-83e2-c0bc5fa90d3e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8c1632f2-4571-4112-83e2-c0bc5fa90d3e\") " pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.588471 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1632f2-4571-4112-83e2-c0bc5fa90d3e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8c1632f2-4571-4112-83e2-c0bc5fa90d3e\") " pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.588602 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd2nl\" (UniqueName: \"kubernetes.io/projected/8c1632f2-4571-4112-83e2-c0bc5fa90d3e-kube-api-access-xd2nl\") pod \"nova-cell1-conductor-0\" (UID: \"8c1632f2-4571-4112-83e2-c0bc5fa90d3e\") " pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.646293 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-22644"] Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.648330 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.656329 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-22644"] Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.672443 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.690415 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-utilities\") pod \"certified-operators-22644\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.690503 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5pjp\" (UniqueName: \"kubernetes.io/projected/9442f819-8959-4c6c-8373-385d5b15ac7c-kube-api-access-m5pjp\") pod \"certified-operators-22644\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.690546 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1632f2-4571-4112-83e2-c0bc5fa90d3e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8c1632f2-4571-4112-83e2-c0bc5fa90d3e\") " pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.690569 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1632f2-4571-4112-83e2-c0bc5fa90d3e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8c1632f2-4571-4112-83e2-c0bc5fa90d3e\") " pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.690598 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd2nl\" (UniqueName: \"kubernetes.io/projected/8c1632f2-4571-4112-83e2-c0bc5fa90d3e-kube-api-access-xd2nl\") pod \"nova-cell1-conductor-0\" (UID: \"8c1632f2-4571-4112-83e2-c0bc5fa90d3e\") " pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.690613 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-catalog-content\") pod \"certified-operators-22644\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.696024 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1632f2-4571-4112-83e2-c0bc5fa90d3e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8c1632f2-4571-4112-83e2-c0bc5fa90d3e\") " pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.711730 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c1632f2-4571-4112-83e2-c0bc5fa90d3e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8c1632f2-4571-4112-83e2-c0bc5fa90d3e\") " pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.713930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd2nl\" (UniqueName: \"kubernetes.io/projected/8c1632f2-4571-4112-83e2-c0bc5fa90d3e-kube-api-access-xd2nl\") pod \"nova-cell1-conductor-0\" (UID: \"8c1632f2-4571-4112-83e2-c0bc5fa90d3e\") " pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.729947 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc89f58d7-tg9dw"] Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.731537 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" podUID="2281f6ae-aa77-445f-9f3e-4c15ce93debf" containerName="dnsmasq-dns" containerID="cri-o://6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4" gracePeriod=10 Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.792820 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-catalog-content\") pod \"certified-operators-22644\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.792977 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-utilities\") pod \"certified-operators-22644\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.793049 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m5pjp\" (UniqueName: \"kubernetes.io/projected/9442f819-8959-4c6c-8373-385d5b15ac7c-kube-api-access-m5pjp\") pod \"certified-operators-22644\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.793416 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-catalog-content\") pod \"certified-operators-22644\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.793468 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-utilities\") pod \"certified-operators-22644\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.795154 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.815493 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5pjp\" (UniqueName: \"kubernetes.io/projected/9442f819-8959-4c6c-8373-385d5b15ac7c-kube-api-access-m5pjp\") pod \"certified-operators-22644\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:57 crc kubenswrapper[4834]: I1126 12:27:57.973961 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-22644" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.124434 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.258132 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.304989 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c697f0b-b130-4424-8309-d2874e94f4d8-logs\") pod \"5c697f0b-b130-4424-8309-d2874e94f4d8\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.305093 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-config-data\") pod \"5c697f0b-b130-4424-8309-d2874e94f4d8\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.305116 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cxbk\" (UniqueName: \"kubernetes.io/projected/5c697f0b-b130-4424-8309-d2874e94f4d8-kube-api-access-5cxbk\") pod \"5c697f0b-b130-4424-8309-d2874e94f4d8\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.305329 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-combined-ca-bundle\") pod \"5c697f0b-b130-4424-8309-d2874e94f4d8\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.305365 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-nova-metadata-tls-certs\") pod \"5c697f0b-b130-4424-8309-d2874e94f4d8\" (UID: 
\"5c697f0b-b130-4424-8309-d2874e94f4d8\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.305423 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c697f0b-b130-4424-8309-d2874e94f4d8-logs" (OuterVolumeSpecName: "logs") pod "5c697f0b-b130-4424-8309-d2874e94f4d8" (UID: "5c697f0b-b130-4424-8309-d2874e94f4d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.305748 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c697f0b-b130-4424-8309-d2874e94f4d8-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.309474 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c697f0b-b130-4424-8309-d2874e94f4d8-kube-api-access-5cxbk" (OuterVolumeSpecName: "kube-api-access-5cxbk") pod "5c697f0b-b130-4424-8309-d2874e94f4d8" (UID: "5c697f0b-b130-4424-8309-d2874e94f4d8"). InnerVolumeSpecName "kube-api-access-5cxbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.320397 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.333240 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-config-data" (OuterVolumeSpecName: "config-data") pod "5c697f0b-b130-4424-8309-d2874e94f4d8" (UID: "5c697f0b-b130-4424-8309-d2874e94f4d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.354646 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c697f0b-b130-4424-8309-d2874e94f4d8" (UID: "5c697f0b-b130-4424-8309-d2874e94f4d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.402413 4834 generic.go:334] "Generic (PLEG): container finished" podID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerID="0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a" exitCode=0 Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.402440 4834 generic.go:334] "Generic (PLEG): container finished" podID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerID="d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1" exitCode=143 Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.402480 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c697f0b-b130-4424-8309-d2874e94f4d8","Type":"ContainerDied","Data":"0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a"} Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.402505 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c697f0b-b130-4424-8309-d2874e94f4d8","Type":"ContainerDied","Data":"d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1"} Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.402515 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c697f0b-b130-4424-8309-d2874e94f4d8","Type":"ContainerDied","Data":"32971e92b5bec9e4f2b0341f7936516554fedca16d99ee1102d5469763faee5c"} Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.402529 4834 scope.go:117] 
"RemoveContainer" containerID="0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.402641 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.407280 4834 generic.go:334] "Generic (PLEG): container finished" podID="2281f6ae-aa77-445f-9f3e-4c15ce93debf" containerID="6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4" exitCode=0 Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.407456 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" event={"ID":"2281f6ae-aa77-445f-9f3e-4c15ce93debf","Type":"ContainerDied","Data":"6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4"} Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.407536 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" event={"ID":"2281f6ae-aa77-445f-9f3e-4c15ce93debf","Type":"ContainerDied","Data":"9cd308f73bdb257d6ae523b01c8f600b335ea81473b5aa9f39a3e5379a2390da"} Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.407652 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc89f58d7-tg9dw" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.408411 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5c697f0b-b130-4424-8309-d2874e94f4d8" (UID: "5c697f0b-b130-4424-8309-d2874e94f4d8"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.408751 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fba220-facf-4b30-8525-a923cc54bda7-logs\") pod \"26fba220-facf-4b30-8525-a923cc54bda7\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.408849 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-nova-metadata-tls-certs\") pod \"5c697f0b-b130-4424-8309-d2874e94f4d8\" (UID: \"5c697f0b-b130-4424-8309-d2874e94f4d8\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.408885 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-config-data\") pod \"26fba220-facf-4b30-8525-a923cc54bda7\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.409023 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8bh7\" (UniqueName: \"kubernetes.io/projected/26fba220-facf-4b30-8525-a923cc54bda7-kube-api-access-w8bh7\") pod \"26fba220-facf-4b30-8525-a923cc54bda7\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.409050 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-combined-ca-bundle\") pod \"26fba220-facf-4b30-8525-a923cc54bda7\" (UID: \"26fba220-facf-4b30-8525-a923cc54bda7\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.409549 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.409562 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.409571 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cxbk\" (UniqueName: \"kubernetes.io/projected/5c697f0b-b130-4424-8309-d2874e94f4d8-kube-api-access-5cxbk\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: W1126 12:27:58.409965 4834 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5c697f0b-b130-4424-8309-d2874e94f4d8/volumes/kubernetes.io~secret/nova-metadata-tls-certs Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.409978 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5c697f0b-b130-4424-8309-d2874e94f4d8" (UID: "5c697f0b-b130-4424-8309-d2874e94f4d8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.410353 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26fba220-facf-4b30-8525-a923cc54bda7-logs" (OuterVolumeSpecName: "logs") pod "26fba220-facf-4b30-8525-a923cc54bda7" (UID: "26fba220-facf-4b30-8525-a923cc54bda7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.413232 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26fba220-facf-4b30-8525-a923cc54bda7-kube-api-access-w8bh7" (OuterVolumeSpecName: "kube-api-access-w8bh7") pod "26fba220-facf-4b30-8525-a923cc54bda7" (UID: "26fba220-facf-4b30-8525-a923cc54bda7"). InnerVolumeSpecName "kube-api-access-w8bh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.413596 4834 generic.go:334] "Generic (PLEG): container finished" podID="26fba220-facf-4b30-8525-a923cc54bda7" containerID="75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b" exitCode=0 Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.413613 4834 generic.go:334] "Generic (PLEG): container finished" podID="26fba220-facf-4b30-8525-a923cc54bda7" containerID="2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb" exitCode=143 Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.414252 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.414482 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26fba220-facf-4b30-8525-a923cc54bda7","Type":"ContainerDied","Data":"75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b"} Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.414560 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26fba220-facf-4b30-8525-a923cc54bda7","Type":"ContainerDied","Data":"2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb"} Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.414618 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26fba220-facf-4b30-8525-a923cc54bda7","Type":"ContainerDied","Data":"c0699ce7d7dcbd54bc6fc3c67ad251cd539daf831bdc0a951f3dee8e0367935a"} Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.457907 4834 scope.go:117] "RemoveContainer" containerID="d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.458002 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.481692 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26fba220-facf-4b30-8525-a923cc54bda7" (UID: "26fba220-facf-4b30-8525-a923cc54bda7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.492396 4834 scope.go:117] "RemoveContainer" containerID="0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.492760 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a\": container with ID starting with 0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a not found: ID does not exist" containerID="0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.492783 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a"} err="failed to get container status \"0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a\": rpc error: code = NotFound desc = could not find container \"0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a\": container with ID starting with 0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.492803 4834 scope.go:117] "RemoveContainer" containerID="d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.498381 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1\": container with ID starting with d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1 not found: ID does not exist" containerID="d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.498407 
4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1"} err="failed to get container status \"d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1\": rpc error: code = NotFound desc = could not find container \"d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1\": container with ID starting with d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1 not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.498423 4834 scope.go:117] "RemoveContainer" containerID="0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.503451 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a"} err="failed to get container status \"0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a\": rpc error: code = NotFound desc = could not find container \"0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a\": container with ID starting with 0756906be2c3c7f68e3beb36d7b35cc3f7d9a09e51c27fa243766939091aae6a not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.503475 4834 scope.go:117] "RemoveContainer" containerID="d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.503705 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1"} err="failed to get container status \"d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1\": rpc error: code = NotFound desc = could not find container \"d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1\": container with ID starting with 
d53011540841cc7669218c5b499b82aa70842a324f620101515ea9edf744aec1 not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.503724 4834 scope.go:117] "RemoveContainer" containerID="6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.503786 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-config-data" (OuterVolumeSpecName: "config-data") pod "26fba220-facf-4b30-8525-a923cc54bda7" (UID: "26fba220-facf-4b30-8525-a923cc54bda7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.510613 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-nb\") pod \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.510692 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-config\") pod \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.510716 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-sb\") pod \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.510795 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsggf\" (UniqueName: 
\"kubernetes.io/projected/2281f6ae-aa77-445f-9f3e-4c15ce93debf-kube-api-access-jsggf\") pod \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.510941 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-dns-svc\") pod \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\" (UID: \"2281f6ae-aa77-445f-9f3e-4c15ce93debf\") " Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.511780 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8bh7\" (UniqueName: \"kubernetes.io/projected/26fba220-facf-4b30-8525-a923cc54bda7-kube-api-access-w8bh7\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.511799 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.511808 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fba220-facf-4b30-8525-a923cc54bda7-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.511816 4834 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c697f0b-b130-4424-8309-d2874e94f4d8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.511825 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fba220-facf-4b30-8525-a923cc54bda7-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.523795 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/2281f6ae-aa77-445f-9f3e-4c15ce93debf-kube-api-access-jsggf" (OuterVolumeSpecName: "kube-api-access-jsggf") pod "2281f6ae-aa77-445f-9f3e-4c15ce93debf" (UID: "2281f6ae-aa77-445f-9f3e-4c15ce93debf"). InnerVolumeSpecName "kube-api-access-jsggf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.551769 4834 scope.go:117] "RemoveContainer" containerID="06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.565397 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-22644"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.568862 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2281f6ae-aa77-445f-9f3e-4c15ce93debf" (UID: "2281f6ae-aa77-445f-9f3e-4c15ce93debf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.577132 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.585476 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2281f6ae-aa77-445f-9f3e-4c15ce93debf" (UID: "2281f6ae-aa77-445f-9f3e-4c15ce93debf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.626046 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-config" (OuterVolumeSpecName: "config") pod "2281f6ae-aa77-445f-9f3e-4c15ce93debf" (UID: "2281f6ae-aa77-445f-9f3e-4c15ce93debf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.629906 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.630194 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsggf\" (UniqueName: \"kubernetes.io/projected/2281f6ae-aa77-445f-9f3e-4c15ce93debf-kube-api-access-jsggf\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.630221 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.668142 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2281f6ae-aa77-445f-9f3e-4c15ce93debf" (UID: "2281f6ae-aa77-445f-9f3e-4c15ce93debf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.734492 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.734521 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2281f6ae-aa77-445f-9f3e-4c15ce93debf-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.742333 4834 scope.go:117] "RemoveContainer" containerID="6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.742825 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4\": container with ID starting with 6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4 not found: ID does not exist" containerID="6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.742870 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4"} err="failed to get container status \"6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4\": rpc error: code = NotFound desc = could not find container \"6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4\": container with ID starting with 6e46e3ace111394761cfb702f596955b607d38f678d1983b0ff4c76575d2a2c4 not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.742895 4834 scope.go:117] "RemoveContainer" containerID="06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a" Nov 26 12:27:58 crc 
kubenswrapper[4834]: E1126 12:27:58.744048 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a\": container with ID starting with 06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a not found: ID does not exist" containerID="06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.744100 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a"} err="failed to get container status \"06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a\": rpc error: code = NotFound desc = could not find container \"06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a\": container with ID starting with 06fff193fe95537d0f2177dfab43b021667bd44b8d5b4cde92ceb04ba4af961a not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.744127 4834 scope.go:117] "RemoveContainer" containerID="75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.771850 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.775015 4834 scope.go:117] "RemoveContainer" containerID="2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.784930 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.803663 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.805466 4834 scope.go:117] "RemoveContainer" 
containerID="75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.805891 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b\": container with ID starting with 75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b not found: ID does not exist" containerID="75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.805925 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b"} err="failed to get container status \"75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b\": rpc error: code = NotFound desc = could not find container \"75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b\": container with ID starting with 75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.805948 4834 scope.go:117] "RemoveContainer" containerID="2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.809203 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb\": container with ID starting with 2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb not found: ID does not exist" containerID="2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.809237 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb"} err="failed to get container status \"2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb\": rpc error: code = NotFound desc = could not find container \"2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb\": container with ID starting with 2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.809255 4834 scope.go:117] "RemoveContainer" containerID="75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.809649 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b"} err="failed to get container status \"75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b\": rpc error: code = NotFound desc = could not find container \"75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b\": container with ID starting with 75b5f6ddc75292b7b87030e6cee56e0baae79a7a75f96100c3df5c2cd416d34b not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.809702 4834 scope.go:117] "RemoveContainer" containerID="2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.809894 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.811123 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb"} err="failed to get container status \"2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb\": rpc error: code = NotFound desc = could not find container 
\"2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb\": container with ID starting with 2f851f1e5599a2ddfb28af1b80c924d908f766eea4f523fd385efafd520156eb not found: ID does not exist" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.815690 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.816072 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerName="nova-metadata-metadata" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816099 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerName="nova-metadata-metadata" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.816111 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2281f6ae-aa77-445f-9f3e-4c15ce93debf" containerName="dnsmasq-dns" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816119 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2281f6ae-aa77-445f-9f3e-4c15ce93debf" containerName="dnsmasq-dns" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.816139 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2281f6ae-aa77-445f-9f3e-4c15ce93debf" containerName="init" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816144 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2281f6ae-aa77-445f-9f3e-4c15ce93debf" containerName="init" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.816160 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerName="nova-metadata-log" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816165 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerName="nova-metadata-log" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.816178 4834 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26fba220-facf-4b30-8525-a923cc54bda7" containerName="nova-api-log" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816183 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fba220-facf-4b30-8525-a923cc54bda7" containerName="nova-api-log" Nov 26 12:27:58 crc kubenswrapper[4834]: E1126 12:27:58.816189 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26fba220-facf-4b30-8525-a923cc54bda7" containerName="nova-api-api" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816197 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fba220-facf-4b30-8525-a923cc54bda7" containerName="nova-api-api" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816356 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerName="nova-metadata-log" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816366 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2281f6ae-aa77-445f-9f3e-4c15ce93debf" containerName="dnsmasq-dns" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816375 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c697f0b-b130-4424-8309-d2874e94f4d8" containerName="nova-metadata-metadata" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816389 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="26fba220-facf-4b30-8525-a923cc54bda7" containerName="nova-api-log" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.816398 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="26fba220-facf-4b30-8525-a923cc54bda7" containerName="nova-api-api" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.817259 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.827246 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.827698 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc89f58d7-tg9dw"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.828042 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.833229 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bc89f58d7-tg9dw"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.840556 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.851417 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.852734 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.857127 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.857253 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.936943 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hznm8\" (UniqueName: \"kubernetes.io/projected/6faa1535-1c01-410b-a1b0-a462683941f2-kube-api-access-hznm8\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.937027 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.937054 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6faa1535-1c01-410b-a1b0-a462683941f2-logs\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.937126 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-config-data\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:58 crc kubenswrapper[4834]: I1126 12:27:58.937155 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.040616 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hznm8\" (UniqueName: \"kubernetes.io/projected/6faa1535-1c01-410b-a1b0-a462683941f2-kube-api-access-hznm8\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.041234 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.041278 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.041331 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6faa1535-1c01-410b-a1b0-a462683941f2-logs\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.041480 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh9p8\" (UniqueName: 
\"kubernetes.io/projected/b0483aad-c574-4d04-bf6a-7b30f49b4214-kube-api-access-qh9p8\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.041532 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-config-data\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.041588 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.041623 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-config-data\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.041686 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0483aad-c574-4d04-bf6a-7b30f49b4214-logs\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.041883 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6faa1535-1c01-410b-a1b0-a462683941f2-logs\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: 
I1126 12:27:59.046256 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.048618 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-config-data\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.054366 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.056100 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hznm8\" (UniqueName: \"kubernetes.io/projected/6faa1535-1c01-410b-a1b0-a462683941f2-kube-api-access-hznm8\") pod \"nova-metadata-0\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") " pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.143685 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.143750 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh9p8\" (UniqueName: 
\"kubernetes.io/projected/b0483aad-c574-4d04-bf6a-7b30f49b4214-kube-api-access-qh9p8\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.143793 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-config-data\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.143822 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0483aad-c574-4d04-bf6a-7b30f49b4214-logs\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.144204 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0483aad-c574-4d04-bf6a-7b30f49b4214-logs\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.144750 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.147379 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.147686 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-config-data\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.157496 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh9p8\" (UniqueName: \"kubernetes.io/projected/b0483aad-c574-4d04-bf6a-7b30f49b4214-kube-api-access-qh9p8\") pod \"nova-api-0\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.164966 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.433858 4834 generic.go:334] "Generic (PLEG): container finished" podID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerID="a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096" exitCode=0 Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.434136 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22644" event={"ID":"9442f819-8959-4c6c-8373-385d5b15ac7c","Type":"ContainerDied","Data":"a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096"} Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.434175 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22644" event={"ID":"9442f819-8959-4c6c-8373-385d5b15ac7c","Type":"ContainerStarted","Data":"2325cfb7c9246b72755933f8bfb5942fc543f556525341461b81fc87f3d4f4be"} Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.436670 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8c1632f2-4571-4112-83e2-c0bc5fa90d3e","Type":"ContainerStarted","Data":"30b264ce2200e6a2f37210e4e80f440fc16b1a91bf90e64c6d54204152bfb94c"} Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.436720 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8c1632f2-4571-4112-83e2-c0bc5fa90d3e","Type":"ContainerStarted","Data":"9b3692ba5b19ea3c19b89bd4ff59b1a46a543b58104c2f4ac491e78d7fbc0278"} Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.437255 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.440664 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9d680001-4cac-43d0-87d1-d42f100ce726" containerName="nova-scheduler-scheduler" 
containerID="cri-o://37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c" gracePeriod=30 Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.472962 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.47295077 podStartE2EDuration="2.47295077s" podCreationTimestamp="2025-11-26 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:27:59.469186054 +0000 UTC m=+977.376399405" watchObservedRunningTime="2025-11-26 12:27:59.47295077 +0000 UTC m=+977.380164123" Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.562207 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:27:59 crc kubenswrapper[4834]: I1126 12:27:59.606113 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:27:59 crc kubenswrapper[4834]: W1126 12:27:59.606593 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0483aad_c574_4d04_bf6a_7b30f49b4214.slice/crio-7171ad256948a4c277a27f0947b5b274fcc49eef7999561f70481dc3debb30dd WatchSource:0}: Error finding container 7171ad256948a4c277a27f0947b5b274fcc49eef7999561f70481dc3debb30dd: Status 404 returned error can't find the container with id 7171ad256948a4c277a27f0947b5b274fcc49eef7999561f70481dc3debb30dd Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.424829 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2281f6ae-aa77-445f-9f3e-4c15ce93debf" path="/var/lib/kubelet/pods/2281f6ae-aa77-445f-9f3e-4c15ce93debf/volumes" Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.425654 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26fba220-facf-4b30-8525-a923cc54bda7" path="/var/lib/kubelet/pods/26fba220-facf-4b30-8525-a923cc54bda7/volumes" 
Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.426199 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c697f0b-b130-4424-8309-d2874e94f4d8" path="/var/lib/kubelet/pods/5c697f0b-b130-4424-8309-d2874e94f4d8/volumes" Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.448030 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6faa1535-1c01-410b-a1b0-a462683941f2","Type":"ContainerStarted","Data":"d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e"} Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.448097 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6faa1535-1c01-410b-a1b0-a462683941f2","Type":"ContainerStarted","Data":"d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb"} Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.448113 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6faa1535-1c01-410b-a1b0-a462683941f2","Type":"ContainerStarted","Data":"77ed8021b071f206a0e7a9f7f62af358abd3533bb1537869d83c30b5d3d381f3"} Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.450565 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b0483aad-c574-4d04-bf6a-7b30f49b4214","Type":"ContainerStarted","Data":"661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd"} Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.450615 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b0483aad-c574-4d04-bf6a-7b30f49b4214","Type":"ContainerStarted","Data":"1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687"} Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.450628 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b0483aad-c574-4d04-bf6a-7b30f49b4214","Type":"ContainerStarted","Data":"7171ad256948a4c277a27f0947b5b274fcc49eef7999561f70481dc3debb30dd"} Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.469174 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4691531429999998 podStartE2EDuration="2.469153143s" podCreationTimestamp="2025-11-26 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:00.46094976 +0000 UTC m=+978.368163112" watchObservedRunningTime="2025-11-26 12:28:00.469153143 +0000 UTC m=+978.376366495" Nov 26 12:28:00 crc kubenswrapper[4834]: I1126 12:28:00.486690 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.486668179 podStartE2EDuration="2.486668179s" podCreationTimestamp="2025-11-26 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:00.474437254 +0000 UTC m=+978.381650606" watchObservedRunningTime="2025-11-26 12:28:00.486668179 +0000 UTC m=+978.393881531" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.324882 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.469117 4834 generic.go:334] "Generic (PLEG): container finished" podID="9d680001-4cac-43d0-87d1-d42f100ce726" containerID="37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c" exitCode=0 Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.469215 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.469621 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9d680001-4cac-43d0-87d1-d42f100ce726","Type":"ContainerDied","Data":"37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c"} Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.469665 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9d680001-4cac-43d0-87d1-d42f100ce726","Type":"ContainerDied","Data":"d314496789a438b9a2f30b0d0c702cf4c490ac24f97d01816197212ddb452c25"} Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.469684 4834 scope.go:117] "RemoveContainer" containerID="37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.472639 4834 generic.go:334] "Generic (PLEG): container finished" podID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerID="f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd" exitCode=0 Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.472680 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22644" event={"ID":"9442f819-8959-4c6c-8373-385d5b15ac7c","Type":"ContainerDied","Data":"f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd"} Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.504675 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-config-data\") pod \"9d680001-4cac-43d0-87d1-d42f100ce726\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.505125 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-combined-ca-bundle\") pod \"9d680001-4cac-43d0-87d1-d42f100ce726\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.505188 4834 scope.go:117] "RemoveContainer" containerID="37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.505206 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcjx\" (UniqueName: \"kubernetes.io/projected/9d680001-4cac-43d0-87d1-d42f100ce726-kube-api-access-qmcjx\") pod \"9d680001-4cac-43d0-87d1-d42f100ce726\" (UID: \"9d680001-4cac-43d0-87d1-d42f100ce726\") " Nov 26 12:28:02 crc kubenswrapper[4834]: E1126 12:28:02.509427 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c\": container with ID starting with 37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c not found: ID does not exist" containerID="37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.509459 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c"} err="failed to get container status \"37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c\": rpc error: code = NotFound desc = could not find container \"37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c\": container with ID starting with 37bfaaa0336fe6f644c2df696e6d94170013f903282ae442397ac142f181342c not found: ID does not exist" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.509703 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9d680001-4cac-43d0-87d1-d42f100ce726-kube-api-access-qmcjx" (OuterVolumeSpecName: "kube-api-access-qmcjx") pod "9d680001-4cac-43d0-87d1-d42f100ce726" (UID: "9d680001-4cac-43d0-87d1-d42f100ce726"). InnerVolumeSpecName "kube-api-access-qmcjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.528243 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-config-data" (OuterVolumeSpecName: "config-data") pod "9d680001-4cac-43d0-87d1-d42f100ce726" (UID: "9d680001-4cac-43d0-87d1-d42f100ce726"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.528821 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d680001-4cac-43d0-87d1-d42f100ce726" (UID: "9d680001-4cac-43d0-87d1-d42f100ce726"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.607262 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.607299 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmcjx\" (UniqueName: \"kubernetes.io/projected/9d680001-4cac-43d0-87d1-d42f100ce726-kube-api-access-qmcjx\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.607332 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d680001-4cac-43d0-87d1-d42f100ce726-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.794550 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.799734 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.818399 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 12:28:02 crc kubenswrapper[4834]: E1126 12:28:02.818995 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d680001-4cac-43d0-87d1-d42f100ce726" containerName="nova-scheduler-scheduler" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.819093 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d680001-4cac-43d0-87d1-d42f100ce726" containerName="nova-scheduler-scheduler" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.819425 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d680001-4cac-43d0-87d1-d42f100ce726" containerName="nova-scheduler-scheduler" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 
12:28:02.820290 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.822651 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.829864 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.916457 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xzc9\" (UniqueName: \"kubernetes.io/projected/0fb89061-9816-4975-9e60-a3fafaf6d334-kube-api-access-9xzc9\") pod \"nova-scheduler-0\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") " pod="openstack/nova-scheduler-0" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.916532 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") " pod="openstack/nova-scheduler-0" Nov 26 12:28:02 crc kubenswrapper[4834]: I1126 12:28:02.916560 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-config-data\") pod \"nova-scheduler-0\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") " pod="openstack/nova-scheduler-0" Nov 26 12:28:03 crc kubenswrapper[4834]: I1126 12:28:03.018724 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xzc9\" (UniqueName: \"kubernetes.io/projected/0fb89061-9816-4975-9e60-a3fafaf6d334-kube-api-access-9xzc9\") pod \"nova-scheduler-0\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") " pod="openstack/nova-scheduler-0" Nov 26 
12:28:03 crc kubenswrapper[4834]: I1126 12:28:03.018762 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") " pod="openstack/nova-scheduler-0" Nov 26 12:28:03 crc kubenswrapper[4834]: I1126 12:28:03.018798 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-config-data\") pod \"nova-scheduler-0\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") " pod="openstack/nova-scheduler-0" Nov 26 12:28:03 crc kubenswrapper[4834]: I1126 12:28:03.022282 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") " pod="openstack/nova-scheduler-0" Nov 26 12:28:03 crc kubenswrapper[4834]: I1126 12:28:03.022354 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-config-data\") pod \"nova-scheduler-0\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") " pod="openstack/nova-scheduler-0" Nov 26 12:28:03 crc kubenswrapper[4834]: I1126 12:28:03.031557 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xzc9\" (UniqueName: \"kubernetes.io/projected/0fb89061-9816-4975-9e60-a3fafaf6d334-kube-api-access-9xzc9\") pod \"nova-scheduler-0\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") " pod="openstack/nova-scheduler-0" Nov 26 12:28:03 crc kubenswrapper[4834]: I1126 12:28:03.136457 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 12:28:03 crc kubenswrapper[4834]: I1126 12:28:03.501532 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 12:28:04 crc kubenswrapper[4834]: I1126 12:28:04.145763 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 12:28:04 crc kubenswrapper[4834]: I1126 12:28:04.145999 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 12:28:04 crc kubenswrapper[4834]: I1126 12:28:04.425048 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d680001-4cac-43d0-87d1-d42f100ce726" path="/var/lib/kubelet/pods/9d680001-4cac-43d0-87d1-d42f100ce726/volumes" Nov 26 12:28:04 crc kubenswrapper[4834]: I1126 12:28:04.488972 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22644" event={"ID":"9442f819-8959-4c6c-8373-385d5b15ac7c","Type":"ContainerStarted","Data":"c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11"} Nov 26 12:28:04 crc kubenswrapper[4834]: I1126 12:28:04.491535 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0fb89061-9816-4975-9e60-a3fafaf6d334","Type":"ContainerStarted","Data":"4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682"} Nov 26 12:28:04 crc kubenswrapper[4834]: I1126 12:28:04.491566 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0fb89061-9816-4975-9e60-a3fafaf6d334","Type":"ContainerStarted","Data":"b3be5d978a403b1838f937dd7f267a5c9c8187a8dac0919ac09dc8c916eb7727"} Nov 26 12:28:04 crc kubenswrapper[4834]: I1126 12:28:04.507135 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-22644" podStartSLOduration=3.560459635 podStartE2EDuration="7.507123773s" 
podCreationTimestamp="2025-11-26 12:27:57 +0000 UTC" firstStartedPulling="2025-11-26 12:27:59.436102382 +0000 UTC m=+977.343315734" lastFinishedPulling="2025-11-26 12:28:03.382766531 +0000 UTC m=+981.289979872" observedRunningTime="2025-11-26 12:28:04.501248759 +0000 UTC m=+982.408462111" watchObservedRunningTime="2025-11-26 12:28:04.507123773 +0000 UTC m=+982.414337125" Nov 26 12:28:04 crc kubenswrapper[4834]: I1126 12:28:04.519932 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5199163159999998 podStartE2EDuration="2.519916316s" podCreationTimestamp="2025-11-26 12:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:04.51330266 +0000 UTC m=+982.420516012" watchObservedRunningTime="2025-11-26 12:28:04.519916316 +0000 UTC m=+982.427129668" Nov 26 12:28:07 crc kubenswrapper[4834]: I1126 12:28:07.815928 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 26 12:28:07 crc kubenswrapper[4834]: I1126 12:28:07.975099 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-22644" Nov 26 12:28:07 crc kubenswrapper[4834]: I1126 12:28:07.975201 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-22644" Nov 26 12:28:08 crc kubenswrapper[4834]: I1126 12:28:08.010178 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-22644" Nov 26 12:28:08 crc kubenswrapper[4834]: I1126 12:28:08.136663 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 26 12:28:08 crc kubenswrapper[4834]: I1126 12:28:08.547749 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-22644" Nov 26 12:28:08 crc kubenswrapper[4834]: I1126 12:28:08.583795 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-22644"] Nov 26 12:28:09 crc kubenswrapper[4834]: I1126 12:28:09.145085 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 12:28:09 crc kubenswrapper[4834]: I1126 12:28:09.145536 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 12:28:09 crc kubenswrapper[4834]: I1126 12:28:09.165651 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 12:28:09 crc kubenswrapper[4834]: I1126 12:28:09.165689 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.158417 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.158435 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.247454 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.247476 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.544661 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-22644" podUID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerName="registry-server" containerID="cri-o://c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11" gracePeriod=2 Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.930075 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-22644" Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.969843 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5pjp\" (UniqueName: \"kubernetes.io/projected/9442f819-8959-4c6c-8373-385d5b15ac7c-kube-api-access-m5pjp\") pod \"9442f819-8959-4c6c-8373-385d5b15ac7c\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.969974 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-catalog-content\") pod \"9442f819-8959-4c6c-8373-385d5b15ac7c\" (UID: \"9442f819-8959-4c6c-8373-385d5b15ac7c\") " Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.970189 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-utilities\") pod \"9442f819-8959-4c6c-8373-385d5b15ac7c\" (UID: 
\"9442f819-8959-4c6c-8373-385d5b15ac7c\") " Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.970739 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-utilities" (OuterVolumeSpecName: "utilities") pod "9442f819-8959-4c6c-8373-385d5b15ac7c" (UID: "9442f819-8959-4c6c-8373-385d5b15ac7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:28:10 crc kubenswrapper[4834]: I1126 12:28:10.974517 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9442f819-8959-4c6c-8373-385d5b15ac7c-kube-api-access-m5pjp" (OuterVolumeSpecName: "kube-api-access-m5pjp") pod "9442f819-8959-4c6c-8373-385d5b15ac7c" (UID: "9442f819-8959-4c6c-8373-385d5b15ac7c"). InnerVolumeSpecName "kube-api-access-m5pjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.006250 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9442f819-8959-4c6c-8373-385d5b15ac7c" (UID: "9442f819-8959-4c6c-8373-385d5b15ac7c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.072676 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.072704 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5pjp\" (UniqueName: \"kubernetes.io/projected/9442f819-8959-4c6c-8373-385d5b15ac7c-kube-api-access-m5pjp\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.072718 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9442f819-8959-4c6c-8373-385d5b15ac7c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.513184 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.559911 4834 generic.go:334] "Generic (PLEG): container finished" podID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerID="c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11" exitCode=0 Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.559973 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-22644" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.559989 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22644" event={"ID":"9442f819-8959-4c6c-8373-385d5b15ac7c","Type":"ContainerDied","Data":"c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11"} Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.560513 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22644" event={"ID":"9442f819-8959-4c6c-8373-385d5b15ac7c","Type":"ContainerDied","Data":"2325cfb7c9246b72755933f8bfb5942fc543f556525341461b81fc87f3d4f4be"} Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.560555 4834 scope.go:117] "RemoveContainer" containerID="c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.582797 4834 scope.go:117] "RemoveContainer" containerID="f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.591505 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-22644"] Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.598666 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-22644"] Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.613852 4834 scope.go:117] "RemoveContainer" containerID="a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.626670 4834 scope.go:117] "RemoveContainer" containerID="c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11" Nov 26 12:28:11 crc kubenswrapper[4834]: E1126 12:28:11.626927 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11\": container with ID starting with c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11 not found: ID does not exist" containerID="c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.626955 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11"} err="failed to get container status \"c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11\": rpc error: code = NotFound desc = could not find container \"c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11\": container with ID starting with c867e0a92e0dd09246809a605509b1450d2a490216f6efdfb1f487cbdbe22d11 not found: ID does not exist" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.626974 4834 scope.go:117] "RemoveContainer" containerID="f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd" Nov 26 12:28:11 crc kubenswrapper[4834]: E1126 12:28:11.627274 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd\": container with ID starting with f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd not found: ID does not exist" containerID="f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.627329 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd"} err="failed to get container status \"f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd\": rpc error: code = NotFound desc = could not find container \"f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd\": container with ID 
starting with f6843b6e240de2668ab334d047cabd64e9cc7d46483926e9892bb19f2a5a68dd not found: ID does not exist" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.627344 4834 scope.go:117] "RemoveContainer" containerID="a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096" Nov 26 12:28:11 crc kubenswrapper[4834]: E1126 12:28:11.627613 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096\": container with ID starting with a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096 not found: ID does not exist" containerID="a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096" Nov 26 12:28:11 crc kubenswrapper[4834]: I1126 12:28:11.627637 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096"} err="failed to get container status \"a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096\": rpc error: code = NotFound desc = could not find container \"a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096\": container with ID starting with a00b980251b211353a9c62ba565cb87ecec5e1708bd3c7fcb9a440f1a61bf096 not found: ID does not exist" Nov 26 12:28:12 crc kubenswrapper[4834]: I1126 12:28:12.425737 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9442f819-8959-4c6c-8373-385d5b15ac7c" path="/var/lib/kubelet/pods/9442f819-8959-4c6c-8373-385d5b15ac7c/volumes" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.115289 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.115678 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8" 
containerName="kube-state-metrics" containerID="cri-o://5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b" gracePeriod=30 Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.136611 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.159291 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.492088 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.575567 4834 generic.go:334] "Generic (PLEG): container finished" podID="fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8" containerID="5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b" exitCode=2 Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.575618 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.575654 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8","Type":"ContainerDied","Data":"5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b"} Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.575692 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8","Type":"ContainerDied","Data":"98e7852bca5a3091f41a9904fb88501161dcc52d7741319c20ab81893865e18c"} Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.575710 4834 scope.go:117] "RemoveContainer" containerID="5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.595021 4834 scope.go:117] "RemoveContainer" containerID="5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b" Nov 26 12:28:13 crc kubenswrapper[4834]: E1126 12:28:13.595295 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b\": container with ID starting with 5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b not found: ID does not exist" containerID="5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.595381 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b"} err="failed to get container status \"5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b\": rpc error: code = NotFound desc = could not find container \"5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b\": container with ID starting with 
5c59d28b2d3d097539daf3d21953f6a4b59f6b1aeed137a0153ec8d4be42560b not found: ID does not exist" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.601185 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.627805 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jj7p\" (UniqueName: \"kubernetes.io/projected/fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8-kube-api-access-2jj7p\") pod \"fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8\" (UID: \"fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8\") " Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.636733 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8-kube-api-access-2jj7p" (OuterVolumeSpecName: "kube-api-access-2jj7p") pod "fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8" (UID: "fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8"). InnerVolumeSpecName "kube-api-access-2jj7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.730048 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jj7p\" (UniqueName: \"kubernetes.io/projected/fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8-kube-api-access-2jj7p\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.900397 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.905343 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.917136 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 12:28:13 crc kubenswrapper[4834]: E1126 12:28:13.917432 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerName="extract-content" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.917448 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerName="extract-content" Nov 26 12:28:13 crc kubenswrapper[4834]: E1126 12:28:13.917458 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8" containerName="kube-state-metrics" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.917464 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8" containerName="kube-state-metrics" Nov 26 12:28:13 crc kubenswrapper[4834]: E1126 12:28:13.917477 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerName="extract-utilities" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.917483 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9442f819-8959-4c6c-8373-385d5b15ac7c" 
containerName="extract-utilities" Nov 26 12:28:13 crc kubenswrapper[4834]: E1126 12:28:13.917513 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerName="registry-server" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.917518 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerName="registry-server" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.917649 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9442f819-8959-4c6c-8373-385d5b15ac7c" containerName="registry-server" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.917665 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8" containerName="kube-state-metrics" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.918174 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.919906 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.920166 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 26 12:28:13 crc kubenswrapper[4834]: I1126 12:28:13.932281 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.003042 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.003500 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="ceilometer-central-agent" 
containerID="cri-o://ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b" gracePeriod=30 Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.003587 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="ceilometer-notification-agent" containerID="cri-o://7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d" gracePeriod=30 Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.003592 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="sg-core" containerID="cri-o://7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0" gracePeriod=30 Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.003542 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="proxy-httpd" containerID="cri-o://246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001" gracePeriod=30 Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.034168 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be94615-ce61-41ff-a0e1-5fe4851d42ea-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.034223 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4be94615-ce61-41ff-a0e1-5fe4851d42ea-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc 
kubenswrapper[4834]: I1126 12:28:14.034252 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be94615-ce61-41ff-a0e1-5fe4851d42ea-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.034277 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckghx\" (UniqueName: \"kubernetes.io/projected/4be94615-ce61-41ff-a0e1-5fe4851d42ea-kube-api-access-ckghx\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.136226 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be94615-ce61-41ff-a0e1-5fe4851d42ea-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.136341 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4be94615-ce61-41ff-a0e1-5fe4851d42ea-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.136388 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be94615-ce61-41ff-a0e1-5fe4851d42ea-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc 
kubenswrapper[4834]: I1126 12:28:14.136438 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckghx\" (UniqueName: \"kubernetes.io/projected/4be94615-ce61-41ff-a0e1-5fe4851d42ea-kube-api-access-ckghx\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.140448 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be94615-ce61-41ff-a0e1-5fe4851d42ea-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.140479 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4be94615-ce61-41ff-a0e1-5fe4851d42ea-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.140600 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be94615-ce61-41ff-a0e1-5fe4851d42ea-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.151439 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckghx\" (UniqueName: \"kubernetes.io/projected/4be94615-ce61-41ff-a0e1-5fe4851d42ea-kube-api-access-ckghx\") pod \"kube-state-metrics-0\" (UID: \"4be94615-ce61-41ff-a0e1-5fe4851d42ea\") " pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.234640 4834 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.427216 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8" path="/var/lib/kubelet/pods/fa0c9cee-52ad-4c9e-9dd2-6c211df2cdc8/volumes" Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.584913 4834 generic.go:334] "Generic (PLEG): container finished" podID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerID="246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001" exitCode=0 Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.584938 4834 generic.go:334] "Generic (PLEG): container finished" podID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerID="7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0" exitCode=2 Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.584945 4834 generic.go:334] "Generic (PLEG): container finished" podID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerID="ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b" exitCode=0 Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.584996 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerDied","Data":"246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001"} Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.585046 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerDied","Data":"7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0"} Nov 26 12:28:14 crc kubenswrapper[4834]: I1126 12:28:14.585056 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerDied","Data":"ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b"} Nov 26 12:28:14 
crc kubenswrapper[4834]: I1126 12:28:14.607003 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 12:28:15 crc kubenswrapper[4834]: I1126 12:28:15.592410 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4be94615-ce61-41ff-a0e1-5fe4851d42ea","Type":"ContainerStarted","Data":"695b413e8f3ecc2253cda1aaa71e7ae1fa058740eabab16edc22dffa95515fb3"} Nov 26 12:28:15 crc kubenswrapper[4834]: I1126 12:28:15.593476 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 26 12:28:15 crc kubenswrapper[4834]: I1126 12:28:15.593501 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4be94615-ce61-41ff-a0e1-5fe4851d42ea","Type":"ContainerStarted","Data":"df3d9984cbddf51b32bead440c3de7968fbf2883640f6e7282b41b7819850d75"} Nov 26 12:28:15 crc kubenswrapper[4834]: I1126 12:28:15.607752 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.356751871 podStartE2EDuration="2.607738486s" podCreationTimestamp="2025-11-26 12:28:13 +0000 UTC" firstStartedPulling="2025-11-26 12:28:14.615900659 +0000 UTC m=+992.523114012" lastFinishedPulling="2025-11-26 12:28:14.866887275 +0000 UTC m=+992.774100627" observedRunningTime="2025-11-26 12:28:15.603150146 +0000 UTC m=+993.510363498" watchObservedRunningTime="2025-11-26 12:28:15.607738486 +0000 UTC m=+993.514951828" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.158188 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.162253 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.163028 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-metadata-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.165799 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.167964 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.168241 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.168339 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.170357 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.219710 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-config-data\") pod \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.219820 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-log-httpd\") pod \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.219854 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-sg-core-conf-yaml\") pod \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.219884 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5sdx\" (UniqueName: \"kubernetes.io/projected/34aac5bd-c148-4b6b-a39f-cdbed6aca483-kube-api-access-n5sdx\") pod \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.219924 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-run-httpd\") pod \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.219959 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-combined-ca-bundle\") pod \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.220008 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-scripts\") pod \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\" (UID: \"34aac5bd-c148-4b6b-a39f-cdbed6aca483\") " Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.223742 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "34aac5bd-c148-4b6b-a39f-cdbed6aca483" (UID: "34aac5bd-c148-4b6b-a39f-cdbed6aca483"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.225051 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "34aac5bd-c148-4b6b-a39f-cdbed6aca483" (UID: "34aac5bd-c148-4b6b-a39f-cdbed6aca483"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.225447 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-scripts" (OuterVolumeSpecName: "scripts") pod "34aac5bd-c148-4b6b-a39f-cdbed6aca483" (UID: "34aac5bd-c148-4b6b-a39f-cdbed6aca483"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.229304 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34aac5bd-c148-4b6b-a39f-cdbed6aca483-kube-api-access-n5sdx" (OuterVolumeSpecName: "kube-api-access-n5sdx") pod "34aac5bd-c148-4b6b-a39f-cdbed6aca483" (UID: "34aac5bd-c148-4b6b-a39f-cdbed6aca483"). InnerVolumeSpecName "kube-api-access-n5sdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.248960 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "34aac5bd-c148-4b6b-a39f-cdbed6aca483" (UID: "34aac5bd-c148-4b6b-a39f-cdbed6aca483"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.276672 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34aac5bd-c148-4b6b-a39f-cdbed6aca483" (UID: "34aac5bd-c148-4b6b-a39f-cdbed6aca483"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.297093 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-config-data" (OuterVolumeSpecName: "config-data") pod "34aac5bd-c148-4b6b-a39f-cdbed6aca483" (UID: "34aac5bd-c148-4b6b-a39f-cdbed6aca483"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.321628 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.321650 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.321661 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5sdx\" (UniqueName: \"kubernetes.io/projected/34aac5bd-c148-4b6b-a39f-cdbed6aca483-kube-api-access-n5sdx\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.321668 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34aac5bd-c148-4b6b-a39f-cdbed6aca483-run-httpd\") on node \"crc\" DevicePath \"\"" 
Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.321676 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.321687 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.321695 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34aac5bd-c148-4b6b-a39f-cdbed6aca483-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.638384 4834 generic.go:334] "Generic (PLEG): container finished" podID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerID="7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d" exitCode=0 Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.638440 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.638463 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerDied","Data":"7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d"} Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.638764 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34aac5bd-c148-4b6b-a39f-cdbed6aca483","Type":"ContainerDied","Data":"80786b9bbd0400d5431097bd571294574dee6f69590796d32713ee8fbc880690"} Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.638785 4834 scope.go:117] "RemoveContainer" containerID="246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.639517 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.642111 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.646502 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.654129 4834 scope.go:117] "RemoveContainer" containerID="7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.682019 4834 scope.go:117] "RemoveContainer" containerID="7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.684940 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.709957 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:28:19 crc 
kubenswrapper[4834]: I1126 12:28:19.738183 4834 scope.go:117] "RemoveContainer" containerID="ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.760074 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:28:19 crc kubenswrapper[4834]: E1126 12:28:19.760446 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="sg-core" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.760465 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="sg-core" Nov 26 12:28:19 crc kubenswrapper[4834]: E1126 12:28:19.760481 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="proxy-httpd" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.760487 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="proxy-httpd" Nov 26 12:28:19 crc kubenswrapper[4834]: E1126 12:28:19.760508 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="ceilometer-notification-agent" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.760513 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="ceilometer-notification-agent" Nov 26 12:28:19 crc kubenswrapper[4834]: E1126 12:28:19.760534 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="ceilometer-central-agent" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.760540 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="ceilometer-central-agent" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.760675 4834 
memory_manager.go:354] "RemoveStaleState removing state" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="proxy-httpd" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.760696 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="ceilometer-notification-agent" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.760710 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="sg-core" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.760723 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" containerName="ceilometer-central-agent" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.762155 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.766404 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.766659 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.766778 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.808152 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.815136 4834 scope.go:117] "RemoveContainer" containerID="246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001" Nov 26 12:28:19 crc kubenswrapper[4834]: E1126 12:28:19.821899 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001\": container with ID starting with 246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001 not found: ID does not exist" containerID="246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.821945 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001"} err="failed to get container status \"246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001\": rpc error: code = NotFound desc = could not find container \"246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001\": container with ID starting with 246729a89e6e27895cef0526a64eb35e49155d169d7f85680e813dafb5629001 not found: ID does not exist" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.821976 4834 scope.go:117] "RemoveContainer" containerID="7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0" Nov 26 12:28:19 crc kubenswrapper[4834]: E1126 12:28:19.822547 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0\": container with ID starting with 7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0 not found: ID does not exist" containerID="7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.822569 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0"} err="failed to get container status \"7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0\": rpc error: code = NotFound desc = could not find container \"7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0\": container with ID 
starting with 7bd193c39906ba007dae689be6055f6ab2528c9c9427f19071d925ac2f75d3a0 not found: ID does not exist" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.822583 4834 scope.go:117] "RemoveContainer" containerID="7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d" Nov 26 12:28:19 crc kubenswrapper[4834]: E1126 12:28:19.825721 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d\": container with ID starting with 7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d not found: ID does not exist" containerID="7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.825747 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d"} err="failed to get container status \"7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d\": rpc error: code = NotFound desc = could not find container \"7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d\": container with ID starting with 7ba88b6cf55a26a2f8288972816f748e979ab02f29769d295a5a849cc922450d not found: ID does not exist" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.825762 4834 scope.go:117] "RemoveContainer" containerID="ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b" Nov 26 12:28:19 crc kubenswrapper[4834]: E1126 12:28:19.831443 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b\": container with ID starting with ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b not found: ID does not exist" containerID="ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b" Nov 26 
12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.831471 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b"} err="failed to get container status \"ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b\": rpc error: code = NotFound desc = could not find container \"ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b\": container with ID starting with ac78658cec3a8fc70c3a7e73457e0e170395f52ec3d46332784d89dceac7578b not found: ID does not exist" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.838380 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.838449 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.838495 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-config-data\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.838510 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-scripts\") pod \"ceilometer-0\" 
(UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.838544 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srvfz\" (UniqueName: \"kubernetes.io/projected/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-kube-api-access-srvfz\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.838576 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-run-httpd\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.838617 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.838644 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-log-httpd\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.878295 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f95c456cf-mwjr5"] Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.896463 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f95c456cf-mwjr5"] Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.896555 4834 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942425 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srvfz\" (UniqueName: \"kubernetes.io/projected/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-kube-api-access-srvfz\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942494 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-run-httpd\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942576 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znq2h\" (UniqueName: \"kubernetes.io/projected/99a2b0e3-787c-4555-9592-9ad76eeedc7e-kube-api-access-znq2h\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942596 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-sb\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942624 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" 
Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942675 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-log-httpd\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942735 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942792 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-nb\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942832 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942896 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-config-data\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942916 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-scripts\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942942 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-config\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.942965 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-dns-svc\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.944291 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-log-httpd\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.944590 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-run-httpd\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.952585 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-scripts\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc 
kubenswrapper[4834]: I1126 12:28:19.969027 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.969237 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-config-data\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.972070 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.973752 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:19 crc kubenswrapper[4834]: I1126 12:28:19.978870 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srvfz\" (UniqueName: \"kubernetes.io/projected/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-kube-api-access-srvfz\") pod \"ceilometer-0\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " pod="openstack/ceilometer-0" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.044482 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-nb\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.044567 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-config\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.044590 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-dns-svc\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.044651 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znq2h\" (UniqueName: \"kubernetes.io/projected/99a2b0e3-787c-4555-9592-9ad76eeedc7e-kube-api-access-znq2h\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.044669 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-sb\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.045389 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.045853 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-nb\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.046352 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-config\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.047081 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-dns-svc\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.059746 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znq2h\" (UniqueName: \"kubernetes.io/projected/99a2b0e3-787c-4555-9592-9ad76eeedc7e-kube-api-access-znq2h\") pod \"dnsmasq-dns-f95c456cf-mwjr5\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.103917 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.304244 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.437289 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34aac5bd-c148-4b6b-a39f-cdbed6aca483" path="/var/lib/kubelet/pods/34aac5bd-c148-4b6b-a39f-cdbed6aca483/volumes" Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.513376 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.646855 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerStarted","Data":"138c87c5b656a64bfc0baa2d75f6940fcf856790670d593890df7fd5b29af036"} Nov 26 12:28:20 crc kubenswrapper[4834]: I1126 12:28:20.760932 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f95c456cf-mwjr5"] Nov 26 12:28:20 crc kubenswrapper[4834]: W1126 12:28:20.766036 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99a2b0e3_787c_4555_9592_9ad76eeedc7e.slice/crio-ab1eddd489dc1f7f823f5d211a6866e68c3a985215e079cd30a1120884648f3a WatchSource:0}: Error finding container ab1eddd489dc1f7f823f5d211a6866e68c3a985215e079cd30a1120884648f3a: Status 404 returned error can't find the container with id ab1eddd489dc1f7f823f5d211a6866e68c3a985215e079cd30a1120884648f3a Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.183034 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.651461 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.654906 4834 generic.go:334] "Generic (PLEG): container finished" podID="99a2b0e3-787c-4555-9592-9ad76eeedc7e" containerID="c1a217ed1cc95559d2b354fb883f6617944de533db0bf19f912d1fb2545338bd" exitCode=0 Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.654967 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" event={"ID":"99a2b0e3-787c-4555-9592-9ad76eeedc7e","Type":"ContainerDied","Data":"c1a217ed1cc95559d2b354fb883f6617944de533db0bf19f912d1fb2545338bd"} Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.655005 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" event={"ID":"99a2b0e3-787c-4555-9592-9ad76eeedc7e","Type":"ContainerStarted","Data":"ab1eddd489dc1f7f823f5d211a6866e68c3a985215e079cd30a1120884648f3a"} Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.657038 4834 generic.go:334] "Generic (PLEG): container finished" podID="fde4d925-7270-403a-b047-9533a8a61c3c" containerID="f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc" exitCode=137 Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.657062 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.657102 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fde4d925-7270-403a-b047-9533a8a61c3c","Type":"ContainerDied","Data":"f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc"} Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.657121 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fde4d925-7270-403a-b047-9533a8a61c3c","Type":"ContainerDied","Data":"7caba0e77c5377a82c37188ae0d11b8237a34c4f481aa5fbddeb6580ccb507c6"} Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.657135 4834 scope.go:117] "RemoveContainer" containerID="f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.716714 4834 scope.go:117] "RemoveContainer" containerID="f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc" Nov 26 12:28:21 crc kubenswrapper[4834]: E1126 12:28:21.717348 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc\": container with ID starting with f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc not found: ID does not exist" containerID="f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.717411 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc"} err="failed to get container status \"f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc\": rpc error: code = NotFound desc = could not find container \"f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc\": container with ID starting with 
f4353c966dac5c2073ff1dfe5181a159c1100a8759412be2b3a9c18adf5196cc not found: ID does not exist" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.753512 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.773181 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-combined-ca-bundle\") pod \"fde4d925-7270-403a-b047-9533a8a61c3c\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.773410 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7wt2\" (UniqueName: \"kubernetes.io/projected/fde4d925-7270-403a-b047-9533a8a61c3c-kube-api-access-z7wt2\") pod \"fde4d925-7270-403a-b047-9533a8a61c3c\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.773478 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-config-data\") pod \"fde4d925-7270-403a-b047-9533a8a61c3c\" (UID: \"fde4d925-7270-403a-b047-9533a8a61c3c\") " Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.779393 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde4d925-7270-403a-b047-9533a8a61c3c-kube-api-access-z7wt2" (OuterVolumeSpecName: "kube-api-access-z7wt2") pod "fde4d925-7270-403a-b047-9533a8a61c3c" (UID: "fde4d925-7270-403a-b047-9533a8a61c3c"). InnerVolumeSpecName "kube-api-access-z7wt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.797021 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fde4d925-7270-403a-b047-9533a8a61c3c" (UID: "fde4d925-7270-403a-b047-9533a8a61c3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.803035 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-config-data" (OuterVolumeSpecName: "config-data") pod "fde4d925-7270-403a-b047-9533a8a61c3c" (UID: "fde4d925-7270-403a-b047-9533a8a61c3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.876264 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.876294 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7wt2\" (UniqueName: \"kubernetes.io/projected/fde4d925-7270-403a-b047-9533a8a61c3c-kube-api-access-z7wt2\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.876317 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde4d925-7270-403a-b047-9533a8a61c3c-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.983115 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 12:28:21 crc kubenswrapper[4834]: I1126 12:28:21.999672 4834 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.007372 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 12:28:22 crc kubenswrapper[4834]: E1126 12:28:22.007809 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde4d925-7270-403a-b047-9533a8a61c3c" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.007827 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde4d925-7270-403a-b047-9533a8a61c3c" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.007968 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde4d925-7270-403a-b047-9533a8a61c3c" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.008599 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.011557 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.011736 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.012280 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.018854 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.087214 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.087277 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.087299 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.087363 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.087419 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsrw\" (UniqueName: \"kubernetes.io/projected/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-kube-api-access-nqsrw\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.188846 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.189346 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsrw\" (UniqueName: \"kubernetes.io/projected/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-kube-api-access-nqsrw\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.189479 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.189554 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.189572 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.206233 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.214743 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.214937 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.215760 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.224769 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsrw\" (UniqueName: \"kubernetes.io/projected/ff344ee7-ddb0-4286-bb4b-dd1b27e8c710-kube-api-access-nqsrw\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.324417 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.425825 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fde4d925-7270-403a-b047-9533a8a61c3c" path="/var/lib/kubelet/pods/fde4d925-7270-403a-b047-9533a8a61c3c/volumes" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.665871 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerStarted","Data":"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"} Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.670297 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" event={"ID":"99a2b0e3-787c-4555-9592-9ad76eeedc7e","Type":"ContainerStarted","Data":"5621be9dfa21086b1a61bc8893995cd7d88c61eb94cf980cb393cbadd0e95612"} Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.670489 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.674583 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-log" containerID="cri-o://1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687" gracePeriod=30 Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.674637 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-api" containerID="cri-o://661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd" gracePeriod=30 Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.697607 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" podStartSLOduration=3.697593941 
podStartE2EDuration="3.697593941s" podCreationTimestamp="2025-11-26 12:28:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:22.692727047 +0000 UTC m=+1000.599940400" watchObservedRunningTime="2025-11-26 12:28:22.697593941 +0000 UTC m=+1000.604807294" Nov 26 12:28:22 crc kubenswrapper[4834]: I1126 12:28:22.730741 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 12:28:22 crc kubenswrapper[4834]: W1126 12:28:22.733719 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff344ee7_ddb0_4286_bb4b_dd1b27e8c710.slice/crio-a6f357a51f911df8a5e38dd18a10df04f9bd3faf5adf300bac3949d3ac8dc946 WatchSource:0}: Error finding container a6f357a51f911df8a5e38dd18a10df04f9bd3faf5adf300bac3949d3ac8dc946: Status 404 returned error can't find the container with id a6f357a51f911df8a5e38dd18a10df04f9bd3faf5adf300bac3949d3ac8dc946 Nov 26 12:28:23 crc kubenswrapper[4834]: I1126 12:28:23.682840 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerStarted","Data":"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"} Nov 26 12:28:23 crc kubenswrapper[4834]: I1126 12:28:23.691153 4834 generic.go:334] "Generic (PLEG): container finished" podID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerID="1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687" exitCode=143 Nov 26 12:28:23 crc kubenswrapper[4834]: I1126 12:28:23.691210 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b0483aad-c574-4d04-bf6a-7b30f49b4214","Type":"ContainerDied","Data":"1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687"} Nov 26 12:28:23 crc kubenswrapper[4834]: I1126 12:28:23.695321 4834 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710","Type":"ContainerStarted","Data":"f8003384dbf8b46a11dfda65890e49ed04a254bd6fdbf981f222f1e7a82db16f"} Nov 26 12:28:23 crc kubenswrapper[4834]: I1126 12:28:23.695360 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ff344ee7-ddb0-4286-bb4b-dd1b27e8c710","Type":"ContainerStarted","Data":"a6f357a51f911df8a5e38dd18a10df04f9bd3faf5adf300bac3949d3ac8dc946"} Nov 26 12:28:23 crc kubenswrapper[4834]: I1126 12:28:23.713781 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.713769073 podStartE2EDuration="2.713769073s" podCreationTimestamp="2025-11-26 12:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:23.706082314 +0000 UTC m=+1001.613295666" watchObservedRunningTime="2025-11-26 12:28:23.713769073 +0000 UTC m=+1001.620982425" Nov 26 12:28:24 crc kubenswrapper[4834]: I1126 12:28:24.242259 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 26 12:28:24 crc kubenswrapper[4834]: I1126 12:28:24.701856 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerStarted","Data":"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6"} Nov 26 12:28:25 crc kubenswrapper[4834]: I1126 12:28:25.710539 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerStarted","Data":"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb"} Nov 26 12:28:25 crc kubenswrapper[4834]: I1126 12:28:25.710918 4834 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ceilometer-0" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="ceilometer-central-agent" containerID="cri-o://d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e" gracePeriod=30 Nov 26 12:28:25 crc kubenswrapper[4834]: I1126 12:28:25.711179 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 12:28:25 crc kubenswrapper[4834]: I1126 12:28:25.711452 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="proxy-httpd" containerID="cri-o://701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb" gracePeriod=30 Nov 26 12:28:25 crc kubenswrapper[4834]: I1126 12:28:25.711500 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="sg-core" containerID="cri-o://c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6" gracePeriod=30 Nov 26 12:28:25 crc kubenswrapper[4834]: I1126 12:28:25.711537 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="ceilometer-notification-agent" containerID="cri-o://27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0" gracePeriod=30 Nov 26 12:28:25 crc kubenswrapper[4834]: I1126 12:28:25.731328 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.097508801 podStartE2EDuration="6.731298669s" podCreationTimestamp="2025-11-26 12:28:19 +0000 UTC" firstStartedPulling="2025-11-26 12:28:20.522386396 +0000 UTC m=+998.429599748" lastFinishedPulling="2025-11-26 12:28:25.156176263 +0000 UTC m=+1003.063389616" observedRunningTime="2025-11-26 12:28:25.72728355 +0000 UTC m=+1003.634496902" watchObservedRunningTime="2025-11-26 
12:28:25.731298669 +0000 UTC m=+1003.638512021" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.114624 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.172511 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh9p8\" (UniqueName: \"kubernetes.io/projected/b0483aad-c574-4d04-bf6a-7b30f49b4214-kube-api-access-qh9p8\") pod \"b0483aad-c574-4d04-bf6a-7b30f49b4214\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.172568 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-config-data\") pod \"b0483aad-c574-4d04-bf6a-7b30f49b4214\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.172740 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0483aad-c574-4d04-bf6a-7b30f49b4214-logs\") pod \"b0483aad-c574-4d04-bf6a-7b30f49b4214\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.172765 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-combined-ca-bundle\") pod \"b0483aad-c574-4d04-bf6a-7b30f49b4214\" (UID: \"b0483aad-c574-4d04-bf6a-7b30f49b4214\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.173188 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0483aad-c574-4d04-bf6a-7b30f49b4214-logs" (OuterVolumeSpecName: "logs") pod "b0483aad-c574-4d04-bf6a-7b30f49b4214" (UID: "b0483aad-c574-4d04-bf6a-7b30f49b4214"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.178505 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0483aad-c574-4d04-bf6a-7b30f49b4214-kube-api-access-qh9p8" (OuterVolumeSpecName: "kube-api-access-qh9p8") pod "b0483aad-c574-4d04-bf6a-7b30f49b4214" (UID: "b0483aad-c574-4d04-bf6a-7b30f49b4214"). InnerVolumeSpecName "kube-api-access-qh9p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.198398 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0483aad-c574-4d04-bf6a-7b30f49b4214" (UID: "b0483aad-c574-4d04-bf6a-7b30f49b4214"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.198438 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-config-data" (OuterVolumeSpecName: "config-data") pod "b0483aad-c574-4d04-bf6a-7b30f49b4214" (UID: "b0483aad-c574-4d04-bf6a-7b30f49b4214"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.274815 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh9p8\" (UniqueName: \"kubernetes.io/projected/b0483aad-c574-4d04-bf6a-7b30f49b4214-kube-api-access-qh9p8\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.274843 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.274854 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0483aad-c574-4d04-bf6a-7b30f49b4214-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.274863 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0483aad-c574-4d04-bf6a-7b30f49b4214-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.668950 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720750 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerID="701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb" exitCode=0 Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720777 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerID="c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6" exitCode=2 Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720787 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerID="27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0" exitCode=0 Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720783 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerDied","Data":"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb"} Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720806 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720820 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerDied","Data":"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6"} Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720832 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerDied","Data":"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"} Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720840 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerDied","Data":"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"} Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720857 4834 scope.go:117] "RemoveContainer" containerID="701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720795 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerID="d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e" exitCode=0 Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.720908 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e","Type":"ContainerDied","Data":"138c87c5b656a64bfc0baa2d75f6940fcf856790670d593890df7fd5b29af036"} Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.722734 4834 generic.go:334] "Generic (PLEG): container finished" podID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerID="661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd" exitCode=0 Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.722759 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b0483aad-c574-4d04-bf6a-7b30f49b4214","Type":"ContainerDied","Data":"661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd"} Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.722779 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.722783 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b0483aad-c574-4d04-bf6a-7b30f49b4214","Type":"ContainerDied","Data":"7171ad256948a4c277a27f0947b5b274fcc49eef7999561f70481dc3debb30dd"} Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.737684 4834 scope.go:117] "RemoveContainer" containerID="c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.740462 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.753661 4834 scope.go:117] "RemoveContainer" containerID="27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.756303 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.765590 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.765931 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-log" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.765948 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-log" Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.765982 4834 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-api" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.765989 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-api" Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.766004 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="proxy-httpd" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766009 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="proxy-httpd" Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.766022 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="sg-core" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766027 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="sg-core" Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.766042 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="ceilometer-notification-agent" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766049 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="ceilometer-notification-agent" Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.766057 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="ceilometer-central-agent" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766062 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="ceilometer-central-agent" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766240 4834 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="sg-core" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766254 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="proxy-httpd" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766266 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-log" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766277 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="ceilometer-central-agent" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766287 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" containerName="ceilometer-notification-agent" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.766295 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" containerName="nova-api-api" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.767182 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.769125 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.769431 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.769569 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.772651 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.775975 4834 scope.go:117] "RemoveContainer" containerID="d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.787902 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-run-httpd\") pod \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.787973 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-sg-core-conf-yaml\") pod \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.788062 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-combined-ca-bundle\") pod \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.788123 
4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-scripts\") pod \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.788145 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-log-httpd\") pod \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.788170 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srvfz\" (UniqueName: \"kubernetes.io/projected/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-kube-api-access-srvfz\") pod \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.788294 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-config-data\") pod \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.788344 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-ceilometer-tls-certs\") pod \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\" (UID: \"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e\") " Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.790473 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" (UID: 
"c7d42ab8-a8a2-4856-96b3-be04fdd07b3e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.790816 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" (UID: "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.793002 4834 scope.go:117] "RemoveContainer" containerID="701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb" Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.793489 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb\": container with ID starting with 701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb not found: ID does not exist" containerID="701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.793515 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb"} err="failed to get container status \"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb\": rpc error: code = NotFound desc = could not find container \"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb\": container with ID starting with 701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb not found: ID does not exist" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.793538 4834 scope.go:117] "RemoveContainer" containerID="c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6" Nov 26 12:28:26 crc 
kubenswrapper[4834]: I1126 12:28:26.793705 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-kube-api-access-srvfz" (OuterVolumeSpecName: "kube-api-access-srvfz") pod "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" (UID: "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e"). InnerVolumeSpecName "kube-api-access-srvfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.793935 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6\": container with ID starting with c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6 not found: ID does not exist" containerID="c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.794168 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6"} err="failed to get container status \"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6\": rpc error: code = NotFound desc = could not find container \"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6\": container with ID starting with c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6 not found: ID does not exist" Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.794197 4834 scope.go:117] "RemoveContainer" containerID="27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0" Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.794550 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0\": container with ID starting with 
27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0 not found: ID does not exist" containerID="27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.794573 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"} err="failed to get container status \"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0\": rpc error: code = NotFound desc = could not find container \"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0\": container with ID starting with 27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0 not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.794721 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-scripts" (OuterVolumeSpecName: "scripts") pod "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" (UID: "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.795210 4834 scope.go:117] "RemoveContainer" containerID="d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"
Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.795567 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e\": container with ID starting with d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e not found: ID does not exist" containerID="d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.795600 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"} err="failed to get container status \"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e\": rpc error: code = NotFound desc = could not find container \"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e\": container with ID starting with d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.795624 4834 scope.go:117] "RemoveContainer" containerID="701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.796076 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb"} err="failed to get container status \"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb\": rpc error: code = NotFound desc = could not find container \"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb\": container with ID starting with 701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.796095 4834 scope.go:117] "RemoveContainer" containerID="c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.796322 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6"} err="failed to get container status \"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6\": rpc error: code = NotFound desc = could not find container \"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6\": container with ID starting with c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6 not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.796342 4834 scope.go:117] "RemoveContainer" containerID="27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.796607 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"} err="failed to get container status \"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0\": rpc error: code = NotFound desc = could not find container \"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0\": container with ID starting with 27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0 not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.796631 4834 scope.go:117] "RemoveContainer" containerID="d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.796830 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"} err="failed to get container status \"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e\": rpc error: code = NotFound desc = could not find container \"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e\": container with ID starting with d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.796850 4834 scope.go:117] "RemoveContainer" containerID="701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.797210 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb"} err="failed to get container status \"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb\": rpc error: code = NotFound desc = could not find container \"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb\": container with ID starting with 701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.797230 4834 scope.go:117] "RemoveContainer" containerID="c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.797488 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6"} err="failed to get container status \"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6\": rpc error: code = NotFound desc = could not find container \"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6\": container with ID starting with c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6 not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.797514 4834 scope.go:117] "RemoveContainer" containerID="27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.797717 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"} err="failed to get container status \"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0\": rpc error: code = NotFound desc = could not find container \"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0\": container with ID starting with 27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0 not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.797738 4834 scope.go:117] "RemoveContainer" containerID="d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.798153 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"} err="failed to get container status \"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e\": rpc error: code = NotFound desc = could not find container \"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e\": container with ID starting with d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.798175 4834 scope.go:117] "RemoveContainer" containerID="701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.798455 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb"} err="failed to get container status \"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb\": rpc error: code = NotFound desc = could not find container \"701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb\": container with ID starting with 701b2e834fd83775b2129fcdc98edcad75b3fb8b91d0d460a92609310959addb not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.798474 4834 scope.go:117] "RemoveContainer" containerID="c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.798841 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6"} err="failed to get container status \"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6\": rpc error: code = NotFound desc = could not find container \"c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6\": container with ID starting with c1d9e3748b75906c14fd5caa4bd177c630284a69466d3283d74f667b37374ff6 not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.798861 4834 scope.go:117] "RemoveContainer" containerID="27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.799111 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0"} err="failed to get container status \"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0\": rpc error: code = NotFound desc = could not find container \"27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0\": container with ID starting with 27b370657547d3205c6fa50440711c7819ecc041d0d661febbf25f76f233fcc0 not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.799128 4834 scope.go:117] "RemoveContainer" containerID="d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.799318 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e"} err="failed to get container status \"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e\": rpc error: code = NotFound desc = could not find container \"d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e\": container with ID starting with d3ec6a675f6eba46c2e0307c1d07e100fdd3fce31fa30801582840fd51627d1e not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.799335 4834 scope.go:117] "RemoveContainer" containerID="661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.810202 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" (UID: "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.815937 4834 scope.go:117] "RemoveContainer" containerID="1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.827871 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" (UID: "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.837353 4834 scope.go:117] "RemoveContainer" containerID="661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd"
Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.840275 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd\": container with ID starting with 661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd not found: ID does not exist" containerID="661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.840411 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd"} err="failed to get container status \"661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd\": rpc error: code = NotFound desc = could not find container \"661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd\": container with ID starting with 661984699a26c4e3acd916212acd8b441ba410c77c7e7e65638a4adf62441efd not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.840442 4834 scope.go:117] "RemoveContainer" containerID="1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687"
Nov 26 12:28:26 crc kubenswrapper[4834]: E1126 12:28:26.840755 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687\": container with ID starting with 1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687 not found: ID does not exist" containerID="1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.840786 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687"} err="failed to get container status \"1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687\": rpc error: code = NotFound desc = could not find container \"1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687\": container with ID starting with 1f91ef7292c0d6f8ef9b8cef0b37b44848bff5fed8b7fe6cc7ab7bd46b40c687 not found: ID does not exist"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.854573 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" (UID: "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.861707 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-config-data" (OuterVolumeSpecName: "config-data") pod "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" (UID: "c7d42ab8-a8a2-4856-96b3-be04fdd07b3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893190 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-public-tls-certs\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893268 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddphv\" (UniqueName: \"kubernetes.io/projected/01f41a95-80d8-4d31-99ca-94188e385917-kube-api-access-ddphv\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893383 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f41a95-80d8-4d31-99ca-94188e385917-logs\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893506 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-internal-tls-certs\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893542 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-config-data\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893613 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893702 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893718 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893729 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893744 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893752 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893760 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srvfz\" (UniqueName: \"kubernetes.io/projected/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-kube-api-access-srvfz\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893769 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.893776 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.995436 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddphv\" (UniqueName: \"kubernetes.io/projected/01f41a95-80d8-4d31-99ca-94188e385917-kube-api-access-ddphv\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.995488 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f41a95-80d8-4d31-99ca-94188e385917-logs\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.995548 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-internal-tls-certs\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.995567 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-config-data\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.995927 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.995871 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f41a95-80d8-4d31-99ca-94188e385917-logs\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.996286 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-public-tls-certs\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.998242 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-internal-tls-certs\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.998703 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-config-data\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.998713 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:26 crc kubenswrapper[4834]: I1126 12:28:26.998761 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-public-tls-certs\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.009295 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddphv\" (UniqueName: \"kubernetes.io/projected/01f41a95-80d8-4d31-99ca-94188e385917-kube-api-access-ddphv\") pod \"nova-api-0\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") " pod="openstack/nova-api-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.051449 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.057712 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.070087 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.072224 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.073981 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.074142 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.074389 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.079447 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.081401 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.200700 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.200890 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.200907 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-scripts\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.200942 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.200974 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-config-data\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.201014 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-run-httpd\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.201118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-log-httpd\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.201179 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzcz8\" (UniqueName: \"kubernetes.io/projected/0f3d377f-7b47-4237-8ea4-c697d52f30c8-kube-api-access-nzcz8\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.302879 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.302929 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-scripts\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.302987 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.303010 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-config-data\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.303054 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-run-httpd\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.303091 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-log-httpd\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.303112 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzcz8\" (UniqueName: \"kubernetes.io/projected/0f3d377f-7b47-4237-8ea4-c697d52f30c8-kube-api-access-nzcz8\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.303178 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.303564 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-run-httpd\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.303608 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-log-httpd\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.308210 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.308228 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.308267 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.309541 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-config-data\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.309583 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-scripts\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.322894 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzcz8\" (UniqueName: \"kubernetes.io/projected/0f3d377f-7b47-4237-8ea4-c697d52f30c8-kube-api-access-nzcz8\") pod \"ceilometer-0\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") " pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.325618 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.390626 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.462898 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 26 12:28:27 crc kubenswrapper[4834]: W1126 12:28:27.479343 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f41a95_80d8_4d31_99ca_94188e385917.slice/crio-c3e20374f82d18f2cc44d590a88b903cc9fb60a7fd4091eedbe169522db17c97 WatchSource:0}: Error finding container c3e20374f82d18f2cc44d590a88b903cc9fb60a7fd4091eedbe169522db17c97: Status 404 returned error can't find the container with id c3e20374f82d18f2cc44d590a88b903cc9fb60a7fd4091eedbe169522db17c97
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.742042 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01f41a95-80d8-4d31-99ca-94188e385917","Type":"ContainerStarted","Data":"1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59"}
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.742090 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01f41a95-80d8-4d31-99ca-94188e385917","Type":"ContainerStarted","Data":"5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1"}
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.742101 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01f41a95-80d8-4d31-99ca-94188e385917","Type":"ContainerStarted","Data":"c3e20374f82d18f2cc44d590a88b903cc9fb60a7fd4091eedbe169522db17c97"}
Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.766375 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.766360904 podStartE2EDuration="1.766360904s" podCreationTimestamp="2025-11-26 12:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:27.759132298 +0000 UTC m=+1005.666345650" watchObservedRunningTime="2025-11-26 12:28:27.766360904 +0000 UTC m=+1005.673574255" Nov 26 12:28:27 crc kubenswrapper[4834]: I1126 12:28:27.782438 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:28:27 crc kubenswrapper[4834]: W1126 12:28:27.783901 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f3d377f_7b47_4237_8ea4_c697d52f30c8.slice/crio-b4b5e9057ebc9d766279f0e7a6ae9aff8df054022d02f67a23abdd7819a08241 WatchSource:0}: Error finding container b4b5e9057ebc9d766279f0e7a6ae9aff8df054022d02f67a23abdd7819a08241: Status 404 returned error can't find the container with id b4b5e9057ebc9d766279f0e7a6ae9aff8df054022d02f67a23abdd7819a08241 Nov 26 12:28:28 crc kubenswrapper[4834]: I1126 12:28:28.426383 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0483aad-c574-4d04-bf6a-7b30f49b4214" path="/var/lib/kubelet/pods/b0483aad-c574-4d04-bf6a-7b30f49b4214/volumes" Nov 26 12:28:28 crc kubenswrapper[4834]: I1126 12:28:28.427236 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d42ab8-a8a2-4856-96b3-be04fdd07b3e" path="/var/lib/kubelet/pods/c7d42ab8-a8a2-4856-96b3-be04fdd07b3e/volumes" Nov 26 12:28:28 crc kubenswrapper[4834]: I1126 12:28:28.759428 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerStarted","Data":"46c93440e4ac0eb96b1e7fe0093787ca8eb13cb9ee2b3c51ad9fd8ef58547ac4"} Nov 26 12:28:28 crc kubenswrapper[4834]: I1126 12:28:28.759659 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerStarted","Data":"b4b5e9057ebc9d766279f0e7a6ae9aff8df054022d02f67a23abdd7819a08241"} Nov 26 12:28:29 crc 
kubenswrapper[4834]: I1126 12:28:29.767611 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerStarted","Data":"eef99731cc48444f1922b330426628292b03d4929e0c4a3ee6c56eaa91d023df"} Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.305475 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.359526 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f7bbc55bc-zj66h"] Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.359745 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" podUID="5599c395-4375-4e68-bcc3-448e61f2ee1d" containerName="dnsmasq-dns" containerID="cri-o://12f67a3e666e156e51fd52e8017a07107066c703e0a539b1fbca75516d66a312" gracePeriod=10 Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.778176 4834 generic.go:334] "Generic (PLEG): container finished" podID="5599c395-4375-4e68-bcc3-448e61f2ee1d" containerID="12f67a3e666e156e51fd52e8017a07107066c703e0a539b1fbca75516d66a312" exitCode=0 Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.778335 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" event={"ID":"5599c395-4375-4e68-bcc3-448e61f2ee1d","Type":"ContainerDied","Data":"12f67a3e666e156e51fd52e8017a07107066c703e0a539b1fbca75516d66a312"} Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.778460 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" event={"ID":"5599c395-4375-4e68-bcc3-448e61f2ee1d","Type":"ContainerDied","Data":"b3c1667d91bbf90535a131b80b044d9dc9c29427726af4ca01aa1b6664c0ac6b"} Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.778479 4834 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b3c1667d91bbf90535a131b80b044d9dc9c29427726af4ca01aa1b6664c0ac6b" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.801791 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.863793 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-nb\") pod \"5599c395-4375-4e68-bcc3-448e61f2ee1d\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.863972 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-config\") pod \"5599c395-4375-4e68-bcc3-448e61f2ee1d\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.864001 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-dns-svc\") pod \"5599c395-4375-4e68-bcc3-448e61f2ee1d\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.864048 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djdcr\" (UniqueName: \"kubernetes.io/projected/5599c395-4375-4e68-bcc3-448e61f2ee1d-kube-api-access-djdcr\") pod \"5599c395-4375-4e68-bcc3-448e61f2ee1d\" (UID: \"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.864091 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-sb\") pod \"5599c395-4375-4e68-bcc3-448e61f2ee1d\" (UID: 
\"5599c395-4375-4e68-bcc3-448e61f2ee1d\") " Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.868847 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5599c395-4375-4e68-bcc3-448e61f2ee1d-kube-api-access-djdcr" (OuterVolumeSpecName: "kube-api-access-djdcr") pod "5599c395-4375-4e68-bcc3-448e61f2ee1d" (UID: "5599c395-4375-4e68-bcc3-448e61f2ee1d"). InnerVolumeSpecName "kube-api-access-djdcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.904636 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-config" (OuterVolumeSpecName: "config") pod "5599c395-4375-4e68-bcc3-448e61f2ee1d" (UID: "5599c395-4375-4e68-bcc3-448e61f2ee1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.906098 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5599c395-4375-4e68-bcc3-448e61f2ee1d" (UID: "5599c395-4375-4e68-bcc3-448e61f2ee1d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.906402 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5599c395-4375-4e68-bcc3-448e61f2ee1d" (UID: "5599c395-4375-4e68-bcc3-448e61f2ee1d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.907837 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5599c395-4375-4e68-bcc3-448e61f2ee1d" (UID: "5599c395-4375-4e68-bcc3-448e61f2ee1d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.966286 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.966331 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.966342 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djdcr\" (UniqueName: \"kubernetes.io/projected/5599c395-4375-4e68-bcc3-448e61f2ee1d-kube-api-access-djdcr\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.966354 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:30 crc kubenswrapper[4834]: I1126 12:28:30.966364 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5599c395-4375-4e68-bcc3-448e61f2ee1d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:31 crc kubenswrapper[4834]: I1126 12:28:31.787408 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f7bbc55bc-zj66h" Nov 26 12:28:31 crc kubenswrapper[4834]: I1126 12:28:31.788275 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerStarted","Data":"c8277eb9332b889ead7b3e2b767467cbce54b54d72cf35b6ded93da975bc0ac6"} Nov 26 12:28:31 crc kubenswrapper[4834]: I1126 12:28:31.816870 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f7bbc55bc-zj66h"] Nov 26 12:28:31 crc kubenswrapper[4834]: I1126 12:28:31.824904 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f7bbc55bc-zj66h"] Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.326013 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.348683 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.430003 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5599c395-4375-4e68-bcc3-448e61f2ee1d" path="/var/lib/kubelet/pods/5599c395-4375-4e68-bcc3-448e61f2ee1d/volumes" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.798417 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerStarted","Data":"2fe67af5044e399a1232bfd4f6fb6c9de728cd2e85844e9740d0ac3a6740c439"} Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.816775 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.172941941 podStartE2EDuration="5.816762597s" podCreationTimestamp="2025-11-26 12:28:27 +0000 UTC" firstStartedPulling="2025-11-26 12:28:27.78733521 +0000 UTC m=+1005.694548562" lastFinishedPulling="2025-11-26 
12:28:32.431155866 +0000 UTC m=+1010.338369218" observedRunningTime="2025-11-26 12:28:32.812852738 +0000 UTC m=+1010.720066090" watchObservedRunningTime="2025-11-26 12:28:32.816762597 +0000 UTC m=+1010.723975949" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.821299 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.946022 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4r524"] Nov 26 12:28:32 crc kubenswrapper[4834]: E1126 12:28:32.946655 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599c395-4375-4e68-bcc3-448e61f2ee1d" containerName="dnsmasq-dns" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.946674 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599c395-4375-4e68-bcc3-448e61f2ee1d" containerName="dnsmasq-dns" Nov 26 12:28:32 crc kubenswrapper[4834]: E1126 12:28:32.946701 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599c395-4375-4e68-bcc3-448e61f2ee1d" containerName="init" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.946707 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599c395-4375-4e68-bcc3-448e61f2ee1d" containerName="init" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.946852 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5599c395-4375-4e68-bcc3-448e61f2ee1d" containerName="dnsmasq-dns" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.947530 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.950836 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.951083 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 26 12:28:32 crc kubenswrapper[4834]: I1126 12:28:32.954073 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4r524"] Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.006068 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h75zk\" (UniqueName: \"kubernetes.io/projected/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-kube-api-access-h75zk\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.006130 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.006183 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-scripts\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.006205 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-config-data\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.108258 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h75zk\" (UniqueName: \"kubernetes.io/projected/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-kube-api-access-h75zk\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.108334 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.108385 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-scripts\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.108408 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-config-data\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.112358 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-config-data\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.112850 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.120904 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h75zk\" (UniqueName: \"kubernetes.io/projected/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-kube-api-access-h75zk\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.126100 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-scripts\") pod \"nova-cell1-cell-mapping-4r524\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.267528 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.640550 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4r524"] Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.806709 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4r524" event={"ID":"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18","Type":"ContainerStarted","Data":"d0bf51023cd36dfc45d75298ac07e00b44d3a300b84f9e4e57ff7314d3655740"} Nov 26 12:28:33 crc kubenswrapper[4834]: I1126 12:28:33.807564 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 12:28:34 crc kubenswrapper[4834]: I1126 12:28:34.818084 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4r524" event={"ID":"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18","Type":"ContainerStarted","Data":"4842915b89f15ec8a244c43ba00fb3d189cd6a6d355104f223c4f056b47ea8f3"} Nov 26 12:28:34 crc kubenswrapper[4834]: I1126 12:28:34.839979 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4r524" podStartSLOduration=2.839952843 podStartE2EDuration="2.839952843s" podCreationTimestamp="2025-11-26 12:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:34.833338125 +0000 UTC m=+1012.740551477" watchObservedRunningTime="2025-11-26 12:28:34.839952843 +0000 UTC m=+1012.747166195" Nov 26 12:28:37 crc kubenswrapper[4834]: I1126 12:28:37.080644 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 12:28:37 crc kubenswrapper[4834]: I1126 12:28:37.081014 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 12:28:37 crc kubenswrapper[4834]: I1126 12:28:37.838506 4834 
generic.go:334] "Generic (PLEG): container finished" podID="7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18" containerID="4842915b89f15ec8a244c43ba00fb3d189cd6a6d355104f223c4f056b47ea8f3" exitCode=0 Nov 26 12:28:37 crc kubenswrapper[4834]: I1126 12:28:37.838545 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4r524" event={"ID":"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18","Type":"ContainerDied","Data":"4842915b89f15ec8a244c43ba00fb3d189cd6a6d355104f223c4f056b47ea8f3"} Nov 26 12:28:38 crc kubenswrapper[4834]: I1126 12:28:38.095441 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 12:28:38 crc kubenswrapper[4834]: I1126 12:28:38.095474 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.137896 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4r524" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.222978 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h75zk\" (UniqueName: \"kubernetes.io/projected/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-kube-api-access-h75zk\") pod \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.223025 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-combined-ca-bundle\") pod \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.223071 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-config-data\") pod \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.223210 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-scripts\") pod \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\" (UID: \"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18\") " Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.228346 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-scripts" (OuterVolumeSpecName: "scripts") pod "7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18" (UID: "7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.228884 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-kube-api-access-h75zk" (OuterVolumeSpecName: "kube-api-access-h75zk") pod "7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18" (UID: "7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18"). InnerVolumeSpecName "kube-api-access-h75zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.243999 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-config-data" (OuterVolumeSpecName: "config-data") pod "7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18" (UID: "7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.244475 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18" (UID: "7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.327345 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.327389 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.327446 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h75zk\" (UniqueName: \"kubernetes.io/projected/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-kube-api-access-h75zk\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.327463 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.861742 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4r524" event={"ID":"7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18","Type":"ContainerDied","Data":"d0bf51023cd36dfc45d75298ac07e00b44d3a300b84f9e4e57ff7314d3655740"} Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.861981 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0bf51023cd36dfc45d75298ac07e00b44d3a300b84f9e4e57ff7314d3655740" Nov 26 12:28:39 crc kubenswrapper[4834]: I1126 12:28:39.861793 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4r524"
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.019031 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.019508 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-log" containerID="cri-o://5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1" gracePeriod=30
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.019600 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-api" containerID="cri-o://1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59" gracePeriod=30
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.034071 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.034299 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0fb89061-9816-4975-9e60-a3fafaf6d334" containerName="nova-scheduler-scheduler" containerID="cri-o://4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682" gracePeriod=30
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.042096 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.042347 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-log" containerID="cri-o://d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb" gracePeriod=30
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.042427 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-metadata" containerID="cri-o://d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e" gracePeriod=30
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.871138 4834 generic.go:334] "Generic (PLEG): container finished" podID="01f41a95-80d8-4d31-99ca-94188e385917" containerID="5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1" exitCode=143
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.871220 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01f41a95-80d8-4d31-99ca-94188e385917","Type":"ContainerDied","Data":"5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1"}
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.873989 4834 generic.go:334] "Generic (PLEG): container finished" podID="6faa1535-1c01-410b-a1b0-a462683941f2" containerID="d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb" exitCode=143
Nov 26 12:28:40 crc kubenswrapper[4834]: I1126 12:28:40.874030 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6faa1535-1c01-410b-a1b0-a462683941f2","Type":"ContainerDied","Data":"d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb"}
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.319977 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.464224 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xzc9\" (UniqueName: \"kubernetes.io/projected/0fb89061-9816-4975-9e60-a3fafaf6d334-kube-api-access-9xzc9\") pod \"0fb89061-9816-4975-9e60-a3fafaf6d334\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") "
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.464371 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-combined-ca-bundle\") pod \"0fb89061-9816-4975-9e60-a3fafaf6d334\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") "
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.464431 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-config-data\") pod \"0fb89061-9816-4975-9e60-a3fafaf6d334\" (UID: \"0fb89061-9816-4975-9e60-a3fafaf6d334\") "
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.474674 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb89061-9816-4975-9e60-a3fafaf6d334-kube-api-access-9xzc9" (OuterVolumeSpecName: "kube-api-access-9xzc9") pod "0fb89061-9816-4975-9e60-a3fafaf6d334" (UID: "0fb89061-9816-4975-9e60-a3fafaf6d334"). InnerVolumeSpecName "kube-api-access-9xzc9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.488863 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-config-data" (OuterVolumeSpecName: "config-data") pod "0fb89061-9816-4975-9e60-a3fafaf6d334" (UID: "0fb89061-9816-4975-9e60-a3fafaf6d334"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.494144 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fb89061-9816-4975-9e60-a3fafaf6d334" (UID: "0fb89061-9816-4975-9e60-a3fafaf6d334"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.566911 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xzc9\" (UniqueName: \"kubernetes.io/projected/0fb89061-9816-4975-9e60-a3fafaf6d334-kube-api-access-9xzc9\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.567016 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.567081 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb89061-9816-4975-9e60-a3fafaf6d334-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.888827 4834 generic.go:334] "Generic (PLEG): container finished" podID="0fb89061-9816-4975-9e60-a3fafaf6d334" containerID="4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682" exitCode=0
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.888884 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.888928 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0fb89061-9816-4975-9e60-a3fafaf6d334","Type":"ContainerDied","Data":"4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682"}
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.889354 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0fb89061-9816-4975-9e60-a3fafaf6d334","Type":"ContainerDied","Data":"b3be5d978a403b1838f937dd7f267a5c9c8187a8dac0919ac09dc8c916eb7727"}
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.889378 4834 scope.go:117] "RemoveContainer" containerID="4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.906127 4834 scope.go:117] "RemoveContainer" containerID="4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682"
Nov 26 12:28:41 crc kubenswrapper[4834]: E1126 12:28:41.906448 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682\": container with ID starting with 4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682 not found: ID does not exist" containerID="4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.906482 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682"} err="failed to get container status \"4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682\": rpc error: code = NotFound desc = could not find container \"4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682\": container with ID starting with 4156cd4b50cfab44997afdcc20ae466bc97ccc6f3cfb37533ed45a5c0b771682 not found: ID does not exist"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.914109 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.919856 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.925140 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 26 12:28:41 crc kubenswrapper[4834]: E1126 12:28:41.925633 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18" containerName="nova-manage"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.925661 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18" containerName="nova-manage"
Nov 26 12:28:41 crc kubenswrapper[4834]: E1126 12:28:41.925687 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb89061-9816-4975-9e60-a3fafaf6d334" containerName="nova-scheduler-scheduler"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.925694 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb89061-9816-4975-9e60-a3fafaf6d334" containerName="nova-scheduler-scheduler"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.925841 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18" containerName="nova-manage"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.925861 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb89061-9816-4975-9e60-a3fafaf6d334" containerName="nova-scheduler-scheduler"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.926406 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.929552 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 26 12:28:41 crc kubenswrapper[4834]: I1126 12:28:41.939702 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.076066 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp47h\" (UniqueName: \"kubernetes.io/projected/774b42be-ec58-4678-b226-e590f1367ed2-kube-api-access-hp47h\") pod \"nova-scheduler-0\" (UID: \"774b42be-ec58-4678-b226-e590f1367ed2\") " pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.076131 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774b42be-ec58-4678-b226-e590f1367ed2-config-data\") pod \"nova-scheduler-0\" (UID: \"774b42be-ec58-4678-b226-e590f1367ed2\") " pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.076196 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774b42be-ec58-4678-b226-e590f1367ed2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"774b42be-ec58-4678-b226-e590f1367ed2\") " pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.178608 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774b42be-ec58-4678-b226-e590f1367ed2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"774b42be-ec58-4678-b226-e590f1367ed2\") " pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.178757 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp47h\" (UniqueName: \"kubernetes.io/projected/774b42be-ec58-4678-b226-e590f1367ed2-kube-api-access-hp47h\") pod \"nova-scheduler-0\" (UID: \"774b42be-ec58-4678-b226-e590f1367ed2\") " pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.178789 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774b42be-ec58-4678-b226-e590f1367ed2-config-data\") pod \"nova-scheduler-0\" (UID: \"774b42be-ec58-4678-b226-e590f1367ed2\") " pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.182680 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774b42be-ec58-4678-b226-e590f1367ed2-config-data\") pod \"nova-scheduler-0\" (UID: \"774b42be-ec58-4678-b226-e590f1367ed2\") " pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.183061 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774b42be-ec58-4678-b226-e590f1367ed2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"774b42be-ec58-4678-b226-e590f1367ed2\") " pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.194914 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp47h\" (UniqueName: \"kubernetes.io/projected/774b42be-ec58-4678-b226-e590f1367ed2-kube-api-access-hp47h\") pod \"nova-scheduler-0\" (UID: \"774b42be-ec58-4678-b226-e590f1367ed2\") " pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.245190 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.427407 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb89061-9816-4975-9e60-a3fafaf6d334" path="/var/lib/kubelet/pods/0fb89061-9816-4975-9e60-a3fafaf6d334/volumes"
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.633896 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.898362 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"774b42be-ec58-4678-b226-e590f1367ed2","Type":"ContainerStarted","Data":"1448d220a4bdb82f75087583a091a6804c9797ca66516b340d063ca5085cb81e"}
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.898411 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"774b42be-ec58-4678-b226-e590f1367ed2","Type":"ContainerStarted","Data":"7f413ad16bb10be5c97802f0b22f084a942913fa7ad20e92f52fc4100ae7363f"}
Nov 26 12:28:42 crc kubenswrapper[4834]: I1126 12:28:42.918013 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.917996806 podStartE2EDuration="1.917996806s" podCreationTimestamp="2025-11-26 12:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:42.909774498 +0000 UTC m=+1020.816987850" watchObservedRunningTime="2025-11-26 12:28:42.917996806 +0000 UTC m=+1020.825210148"
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.482536 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.550241 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.610342 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddphv\" (UniqueName: \"kubernetes.io/projected/01f41a95-80d8-4d31-99ca-94188e385917-kube-api-access-ddphv\") pod \"01f41a95-80d8-4d31-99ca-94188e385917\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.610386 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6faa1535-1c01-410b-a1b0-a462683941f2-logs\") pod \"6faa1535-1c01-410b-a1b0-a462683941f2\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.610502 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-public-tls-certs\") pod \"01f41a95-80d8-4d31-99ca-94188e385917\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.610558 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f41a95-80d8-4d31-99ca-94188e385917-logs\") pod \"01f41a95-80d8-4d31-99ca-94188e385917\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.610623 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hznm8\" (UniqueName: \"kubernetes.io/projected/6faa1535-1c01-410b-a1b0-a462683941f2-kube-api-access-hznm8\") pod \"6faa1535-1c01-410b-a1b0-a462683941f2\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.611015 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-nova-metadata-tls-certs\") pod \"6faa1535-1c01-410b-a1b0-a462683941f2\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.611322 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-config-data\") pod \"01f41a95-80d8-4d31-99ca-94188e385917\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.611456 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-combined-ca-bundle\") pod \"01f41a95-80d8-4d31-99ca-94188e385917\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.611528 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-internal-tls-certs\") pod \"01f41a95-80d8-4d31-99ca-94188e385917\" (UID: \"01f41a95-80d8-4d31-99ca-94188e385917\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.611553 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-config-data\") pod \"6faa1535-1c01-410b-a1b0-a462683941f2\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.611579 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-combined-ca-bundle\") pod \"6faa1535-1c01-410b-a1b0-a462683941f2\" (UID: \"6faa1535-1c01-410b-a1b0-a462683941f2\") "
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.610837 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6faa1535-1c01-410b-a1b0-a462683941f2-logs" (OuterVolumeSpecName: "logs") pod "6faa1535-1c01-410b-a1b0-a462683941f2" (UID: "6faa1535-1c01-410b-a1b0-a462683941f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.610950 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f41a95-80d8-4d31-99ca-94188e385917-logs" (OuterVolumeSpecName: "logs") pod "01f41a95-80d8-4d31-99ca-94188e385917" (UID: "01f41a95-80d8-4d31-99ca-94188e385917"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.616267 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f41a95-80d8-4d31-99ca-94188e385917-kube-api-access-ddphv" (OuterVolumeSpecName: "kube-api-access-ddphv") pod "01f41a95-80d8-4d31-99ca-94188e385917" (UID: "01f41a95-80d8-4d31-99ca-94188e385917"). InnerVolumeSpecName "kube-api-access-ddphv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.617678 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6faa1535-1c01-410b-a1b0-a462683941f2-kube-api-access-hznm8" (OuterVolumeSpecName: "kube-api-access-hznm8") pod "6faa1535-1c01-410b-a1b0-a462683941f2" (UID: "6faa1535-1c01-410b-a1b0-a462683941f2"). InnerVolumeSpecName "kube-api-access-hznm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.634981 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6faa1535-1c01-410b-a1b0-a462683941f2" (UID: "6faa1535-1c01-410b-a1b0-a462683941f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.641321 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-config-data" (OuterVolumeSpecName: "config-data") pod "6faa1535-1c01-410b-a1b0-a462683941f2" (UID: "6faa1535-1c01-410b-a1b0-a462683941f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.643406 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01f41a95-80d8-4d31-99ca-94188e385917" (UID: "01f41a95-80d8-4d31-99ca-94188e385917"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.643509 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-config-data" (OuterVolumeSpecName: "config-data") pod "01f41a95-80d8-4d31-99ca-94188e385917" (UID: "01f41a95-80d8-4d31-99ca-94188e385917"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.651725 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6faa1535-1c01-410b-a1b0-a462683941f2" (UID: "6faa1535-1c01-410b-a1b0-a462683941f2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.654755 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "01f41a95-80d8-4d31-99ca-94188e385917" (UID: "01f41a95-80d8-4d31-99ca-94188e385917"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.669813 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "01f41a95-80d8-4d31-99ca-94188e385917" (UID: "01f41a95-80d8-4d31-99ca-94188e385917"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.716231 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f41a95-80d8-4d31-99ca-94188e385917-logs\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.716302 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hznm8\" (UniqueName: \"kubernetes.io/projected/6faa1535-1c01-410b-a1b0-a462683941f2-kube-api-access-hznm8\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.716329 4834 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.718651 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.718672 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.718684 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.718692 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.718700 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6faa1535-1c01-410b-a1b0-a462683941f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.718709 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddphv\" (UniqueName: \"kubernetes.io/projected/01f41a95-80d8-4d31-99ca-94188e385917-kube-api-access-ddphv\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.718716 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6faa1535-1c01-410b-a1b0-a462683941f2-logs\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.718723 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f41a95-80d8-4d31-99ca-94188e385917-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.911729 4834 generic.go:334] "Generic (PLEG): container finished" podID="01f41a95-80d8-4d31-99ca-94188e385917" containerID="1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59" exitCode=0
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.911779 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.911795 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01f41a95-80d8-4d31-99ca-94188e385917","Type":"ContainerDied","Data":"1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59"}
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.911824 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01f41a95-80d8-4d31-99ca-94188e385917","Type":"ContainerDied","Data":"c3e20374f82d18f2cc44d590a88b903cc9fb60a7fd4091eedbe169522db17c97"}
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.911842 4834 scope.go:117] "RemoveContainer" containerID="1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59"
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.916072 4834 generic.go:334] "Generic (PLEG): container finished" podID="6faa1535-1c01-410b-a1b0-a462683941f2" containerID="d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e" exitCode=0
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.916782 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.916927 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6faa1535-1c01-410b-a1b0-a462683941f2","Type":"ContainerDied","Data":"d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e"}
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.916999 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6faa1535-1c01-410b-a1b0-a462683941f2","Type":"ContainerDied","Data":"77ed8021b071f206a0e7a9f7f62af358abd3533bb1537869d83c30b5d3d381f3"}
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.948151 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.964622 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.978018 4834 scope.go:117] "RemoveContainer" containerID="5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1"
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.985754 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 26 12:28:43 crc kubenswrapper[4834]: I1126 12:28:43.994386 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.013490 4834 scope.go:117] "RemoveContainer" containerID="1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59"
Nov 26 12:28:44 crc kubenswrapper[4834]: E1126 12:28:44.017551 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59\": container with ID starting with 1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59 not found: ID does not exist" containerID="1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.017618 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59"} err="failed to get container status \"1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59\": rpc error: code = NotFound desc = could not find container \"1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59\": container with ID starting with 1f4a00929b66bdd02b9fc03957beba0eee3595e3bb02336ebf2fdc4900951c59 not found: ID does not exist"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.017666 4834 scope.go:117] "RemoveContainer" containerID="5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1"
Nov 26 12:28:44 crc kubenswrapper[4834]: E1126 12:28:44.019546 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1\": container with ID starting with 5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1 not found: ID does not exist" containerID="5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.019587 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1"} err="failed to get container status \"5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1\": rpc error: code = NotFound desc = could not find container \"5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1\": container with ID starting with 5a17c661fbde654ccf5d5cf5e915868df661eb84a2df64483bf6426d9d0d4df1 not found: ID does not exist"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.019615 4834 scope.go:117] "RemoveContainer" containerID="d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.022701 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 26 12:28:44 crc kubenswrapper[4834]: E1126 12:28:44.023201 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-log"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.023221 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-log"
Nov 26 12:28:44 crc kubenswrapper[4834]: E1126 12:28:44.023231 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-log"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.023237 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-log"
Nov 26 12:28:44 crc kubenswrapper[4834]: E1126 12:28:44.023264 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-api"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.023270 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-api"
Nov 26 12:28:44 crc kubenswrapper[4834]: E1126 12:28:44.023285 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-metadata"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.023291 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-metadata"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.023568 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-metadata"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.023593 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" containerName="nova-metadata-log"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.023605 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-api"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.023622 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f41a95-80d8-4d31-99ca-94188e385917" containerName="nova-api-log"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.024563 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.027169 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.027491 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.027626 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.033362 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.040480 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.055135 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.059156 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.061247 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.061530 4834 scope.go:117] "RemoveContainer" containerID="d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.064515 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.088829 4834 scope.go:117] "RemoveContainer" containerID="d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e"
Nov 26 12:28:44 crc kubenswrapper[4834]: E1126 12:28:44.089299 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e\": container with ID starting with d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e not found: ID does not exist" containerID="d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e"
Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.089368 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e"} err="failed to get container status \"d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e\": rpc error: code = NotFound desc = could not find container \"d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e\": container with ID starting with d05688c1d51a05b5625c6ea12f67992f48d177fc4089a7d9bf966395d3f7887e not found: ID does not exist"
Nov 26 12:28:44 crc
kubenswrapper[4834]: I1126 12:28:44.089400 4834 scope.go:117] "RemoveContainer" containerID="d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb" Nov 26 12:28:44 crc kubenswrapper[4834]: E1126 12:28:44.089735 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb\": container with ID starting with d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb not found: ID does not exist" containerID="d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.089775 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb"} err="failed to get container status \"d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb\": rpc error: code = NotFound desc = could not find container \"d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb\": container with ID starting with d8e6821cc2ad2ae229184d5b7227b4843f37ac7ccf063e15bd7429bf6a8237cb not found: ID does not exist" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.128620 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f1d827-26b6-46e8-8d2a-0559e7883478-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.128671 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 
12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.128820 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-config-data\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.128898 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-logs\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.128931 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c24l6\" (UniqueName: \"kubernetes.io/projected/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-kube-api-access-c24l6\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.129111 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f1d827-26b6-46e8-8d2a-0559e7883478-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.129151 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f1d827-26b6-46e8-8d2a-0559e7883478-config-data\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.129226 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-public-tls-certs\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.129273 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww2hj\" (UniqueName: \"kubernetes.io/projected/56f1d827-26b6-46e8-8d2a-0559e7883478-kube-api-access-ww2hj\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.129373 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f1d827-26b6-46e8-8d2a-0559e7883478-logs\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.129418 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.231377 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-public-tls-certs\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.231447 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww2hj\" (UniqueName: \"kubernetes.io/projected/56f1d827-26b6-46e8-8d2a-0559e7883478-kube-api-access-ww2hj\") pod 
\"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.231489 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f1d827-26b6-46e8-8d2a-0559e7883478-logs\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.231523 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.231588 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f1d827-26b6-46e8-8d2a-0559e7883478-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.231611 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.231671 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-config-data\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.232413 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f1d827-26b6-46e8-8d2a-0559e7883478-logs\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.233225 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-logs\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.233264 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c24l6\" (UniqueName: \"kubernetes.io/projected/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-kube-api-access-c24l6\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.233416 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f1d827-26b6-46e8-8d2a-0559e7883478-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.233449 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f1d827-26b6-46e8-8d2a-0559e7883478-config-data\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.234054 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-logs\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc 
kubenswrapper[4834]: I1126 12:28:44.235622 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.236197 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-public-tls-certs\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.236382 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-config-data\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.238769 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f1d827-26b6-46e8-8d2a-0559e7883478-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.239194 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f1d827-26b6-46e8-8d2a-0559e7883478-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.245434 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.246243 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f1d827-26b6-46e8-8d2a-0559e7883478-config-data\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.250827 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c24l6\" (UniqueName: \"kubernetes.io/projected/9a65c5e1-ffb4-429f-a58b-2e79a51acc6a-kube-api-access-c24l6\") pod \"nova-api-0\" (UID: \"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a\") " pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.250950 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2hj\" (UniqueName: \"kubernetes.io/projected/56f1d827-26b6-46e8-8d2a-0559e7883478-kube-api-access-ww2hj\") pod \"nova-metadata-0\" (UID: \"56f1d827-26b6-46e8-8d2a-0559e7883478\") " pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.360988 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.377849 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.450941 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f41a95-80d8-4d31-99ca-94188e385917" path="/var/lib/kubelet/pods/01f41a95-80d8-4d31-99ca-94188e385917/volumes" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.452453 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6faa1535-1c01-410b-a1b0-a462683941f2" path="/var/lib/kubelet/pods/6faa1535-1c01-410b-a1b0-a462683941f2/volumes" Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.762565 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 12:28:44 crc kubenswrapper[4834]: W1126 12:28:44.769422 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a65c5e1_ffb4_429f_a58b_2e79a51acc6a.slice/crio-02d1b91d4ffaec3b4fb38f2a337eb4da4ee1721749232e4b8103a0e1b2ed3111 WatchSource:0}: Error finding container 02d1b91d4ffaec3b4fb38f2a337eb4da4ee1721749232e4b8103a0e1b2ed3111: Status 404 returned error can't find the container with id 02d1b91d4ffaec3b4fb38f2a337eb4da4ee1721749232e4b8103a0e1b2ed3111 Nov 26 12:28:44 crc kubenswrapper[4834]: W1126 12:28:44.829543 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56f1d827_26b6_46e8_8d2a_0559e7883478.slice/crio-317d2f7d97e5a51dffc3f6ba6e1b1929171ee3e2a14f3d267d43371be8a5d094 WatchSource:0}: Error finding container 317d2f7d97e5a51dffc3f6ba6e1b1929171ee3e2a14f3d267d43371be8a5d094: Status 404 returned error can't find the container with id 317d2f7d97e5a51dffc3f6ba6e1b1929171ee3e2a14f3d267d43371be8a5d094 Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.834352 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.928056 
4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56f1d827-26b6-46e8-8d2a-0559e7883478","Type":"ContainerStarted","Data":"317d2f7d97e5a51dffc3f6ba6e1b1929171ee3e2a14f3d267d43371be8a5d094"} Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.930623 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a","Type":"ContainerStarted","Data":"c37c4a5c3c32277766f2f9fe867ce48619100953f10ba7d063a76a2041e35259"} Nov 26 12:28:44 crc kubenswrapper[4834]: I1126 12:28:44.930663 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a","Type":"ContainerStarted","Data":"02d1b91d4ffaec3b4fb38f2a337eb4da4ee1721749232e4b8103a0e1b2ed3111"} Nov 26 12:28:45 crc kubenswrapper[4834]: I1126 12:28:45.942066 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a65c5e1-ffb4-429f-a58b-2e79a51acc6a","Type":"ContainerStarted","Data":"fdb599031204328f05f4537f7a8c07bd0b717c19d884ab2a4eac5f3e8fd306a9"} Nov 26 12:28:45 crc kubenswrapper[4834]: I1126 12:28:45.943626 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56f1d827-26b6-46e8-8d2a-0559e7883478","Type":"ContainerStarted","Data":"ef559366b606e1a57dd8ffca5f2373bbe026a7def4b6a50e800b731186d24111"} Nov 26 12:28:45 crc kubenswrapper[4834]: I1126 12:28:45.943667 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56f1d827-26b6-46e8-8d2a-0559e7883478","Type":"ContainerStarted","Data":"82a4a0255a589efd226770b757005db9da4a753dff1c960eda09d00237d87be6"} Nov 26 12:28:45 crc kubenswrapper[4834]: I1126 12:28:45.958618 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9586055079999998 podStartE2EDuration="2.958605508s" 
podCreationTimestamp="2025-11-26 12:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:45.956735522 +0000 UTC m=+1023.863948874" watchObservedRunningTime="2025-11-26 12:28:45.958605508 +0000 UTC m=+1023.865818859" Nov 26 12:28:45 crc kubenswrapper[4834]: I1126 12:28:45.972205 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.972185174 podStartE2EDuration="2.972185174s" podCreationTimestamp="2025-11-26 12:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:28:45.97060708 +0000 UTC m=+1023.877820433" watchObservedRunningTime="2025-11-26 12:28:45.972185174 +0000 UTC m=+1023.879398527" Nov 26 12:28:47 crc kubenswrapper[4834]: I1126 12:28:47.245884 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 26 12:28:49 crc kubenswrapper[4834]: I1126 12:28:49.377941 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 12:28:49 crc kubenswrapper[4834]: I1126 12:28:49.377990 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 12:28:52 crc kubenswrapper[4834]: I1126 12:28:52.245987 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 12:28:52 crc kubenswrapper[4834]: I1126 12:28:52.267026 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 12:28:53 crc kubenswrapper[4834]: I1126 12:28:53.028835 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 12:28:54 crc kubenswrapper[4834]: I1126 12:28:54.362106 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 12:28:54 crc kubenswrapper[4834]: I1126 12:28:54.362388 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 12:28:54 crc kubenswrapper[4834]: I1126 12:28:54.378261 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 12:28:54 crc kubenswrapper[4834]: I1126 12:28:54.378304 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 12:28:55 crc kubenswrapper[4834]: I1126 12:28:55.376415 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9a65c5e1-ffb4-429f-a58b-2e79a51acc6a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 12:28:55 crc kubenswrapper[4834]: I1126 12:28:55.376669 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9a65c5e1-ffb4-429f-a58b-2e79a51acc6a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 12:28:55 crc kubenswrapper[4834]: I1126 12:28:55.387432 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="56f1d827-26b6-46e8-8d2a-0559e7883478" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 12:28:55 crc kubenswrapper[4834]: I1126 12:28:55.387434 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="56f1d827-26b6-46e8-8d2a-0559e7883478" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Nov 26 12:28:57 crc kubenswrapper[4834]: I1126 12:28:57.399146 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 26 12:29:04 crc kubenswrapper[4834]: I1126 12:29:04.369538 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 12:29:04 crc kubenswrapper[4834]: I1126 12:29:04.370237 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 26 12:29:04 crc kubenswrapper[4834]: I1126 12:29:04.370963 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 12:29:04 crc kubenswrapper[4834]: I1126 12:29:04.375374 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 12:29:04 crc kubenswrapper[4834]: I1126 12:29:04.383741 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 12:29:04 crc kubenswrapper[4834]: I1126 12:29:04.387922 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 12:29:04 crc kubenswrapper[4834]: I1126 12:29:04.388099 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 12:29:05 crc kubenswrapper[4834]: I1126 12:29:05.092859 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 26 12:29:05 crc kubenswrapper[4834]: I1126 12:29:05.097654 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 12:29:05 crc kubenswrapper[4834]: I1126 12:29:05.098071 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 12:29:11 crc kubenswrapper[4834]: I1126 12:29:11.755737 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-server-0"] Nov 26 12:29:12 crc kubenswrapper[4834]: I1126 12:29:12.323145 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 12:29:15 crc kubenswrapper[4834]: I1126 12:29:15.035472 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3f30c5fe-7895-474e-a94d-967b23650025" containerName="rabbitmq" containerID="cri-o://74d0607133594134910f3e5a2b2b986d1d3c7f8763f17093a89e8ac30861a21b" gracePeriod=604797 Nov 26 12:29:15 crc kubenswrapper[4834]: I1126 12:29:15.577010 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="25058ace-b8d8-4ac1-924a-844f3955f0fb" containerName="rabbitmq" containerID="cri-o://066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f" gracePeriod=604797 Nov 26 12:29:20 crc kubenswrapper[4834]: I1126 12:29:20.341712 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="25058ace-b8d8-4ac1-924a-844f3955f0fb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Nov 26 12:29:20 crc kubenswrapper[4834]: I1126 12:29:20.378480 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3f30c5fe-7895-474e-a94d-967b23650025" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.94:5671: connect: connection refused" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.195760 4834 generic.go:334] "Generic (PLEG): container finished" podID="3f30c5fe-7895-474e-a94d-967b23650025" containerID="74d0607133594134910f3e5a2b2b986d1d3c7f8763f17093a89e8ac30861a21b" exitCode=0 Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.195837 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"3f30c5fe-7895-474e-a94d-967b23650025","Type":"ContainerDied","Data":"74d0607133594134910f3e5a2b2b986d1d3c7f8763f17093a89e8ac30861a21b"} Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.397257 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.529365 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-plugins-conf\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.529467 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-plugins\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.529984 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530050 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f30c5fe-7895-474e-a94d-967b23650025-pod-info\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530122 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-server-conf\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530042 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530580 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-tls\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530637 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530678 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-config-data\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530698 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srwwd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-kube-api-access-srwwd\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530749 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-confd\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530779 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-erlang-cookie\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530773 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530825 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f30c5fe-7895-474e-a94d-967b23650025-erlang-cookie-secret\") pod \"3f30c5fe-7895-474e-a94d-967b23650025\" (UID: \"3f30c5fe-7895-474e-a94d-967b23650025\") " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.530824 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.531476 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.531491 4834 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.531539 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.535046 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3f30c5fe-7895-474e-a94d-967b23650025-pod-info" (OuterVolumeSpecName: "pod-info") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.535088 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-kube-api-access-srwwd" (OuterVolumeSpecName: "kube-api-access-srwwd") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "kube-api-access-srwwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.535211 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.552634 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f30c5fe-7895-474e-a94d-967b23650025-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.552742 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.553581 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-config-data" (OuterVolumeSpecName: "config-data") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.568691 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-server-conf" (OuterVolumeSpecName: "server-conf") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.607645 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3f30c5fe-7895-474e-a94d-967b23650025" (UID: "3f30c5fe-7895-474e-a94d-967b23650025"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.634007 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.634096 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.634108 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.634119 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srwwd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-kube-api-access-srwwd\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.634130 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.634139 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/3f30c5fe-7895-474e-a94d-967b23650025-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.634147 4834 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f30c5fe-7895-474e-a94d-967b23650025-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.634156 4834 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f30c5fe-7895-474e-a94d-967b23650025-pod-info\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.634166 4834 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f30c5fe-7895-474e-a94d-967b23650025-server-conf\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.660116 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.735663 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:21 crc kubenswrapper[4834]: I1126 12:29:21.891133 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.040722 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-plugins-conf\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.040847 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/25058ace-b8d8-4ac1-924a-844f3955f0fb-pod-info\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.040869 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25058ace-b8d8-4ac1-924a-844f3955f0fb-erlang-cookie-secret\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.040888 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrvpl\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-kube-api-access-qrvpl\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.040901 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-tls\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.040987 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-erlang-cookie\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.041003 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.041020 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-confd\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.041086 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-config-data\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.041101 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-server-conf\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.041124 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-plugins\") pod \"25058ace-b8d8-4ac1-924a-844f3955f0fb\" (UID: \"25058ace-b8d8-4ac1-924a-844f3955f0fb\") " Nov 26 12:29:22 crc kubenswrapper[4834]: 
I1126 12:29:22.041702 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.041881 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.041978 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.042191 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.042212 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.042222 4834 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.044097 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/25058ace-b8d8-4ac1-924a-844f3955f0fb-pod-info" (OuterVolumeSpecName: "pod-info") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.044144 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-kube-api-access-qrvpl" (OuterVolumeSpecName: "kube-api-access-qrvpl") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "kube-api-access-qrvpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.044657 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.045583 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.045982 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25058ace-b8d8-4ac1-924a-844f3955f0fb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.075618 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-config-data" (OuterVolumeSpecName: "config-data") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.088772 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-server-conf" (OuterVolumeSpecName: "server-conf") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.111482 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "25058ace-b8d8-4ac1-924a-844f3955f0fb" (UID: "25058ace-b8d8-4ac1-924a-844f3955f0fb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.143691 4834 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/25058ace-b8d8-4ac1-924a-844f3955f0fb-pod-info\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.143726 4834 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25058ace-b8d8-4ac1-924a-844f3955f0fb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.143740 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrvpl\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-kube-api-access-qrvpl\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.143749 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-tls\") on node \"crc\" 
DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.143777 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.143786 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/25058ace-b8d8-4ac1-924a-844f3955f0fb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.143794 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.143801 4834 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/25058ace-b8d8-4ac1-924a-844f3955f0fb-server-conf\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.170601 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.205430 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f30c5fe-7895-474e-a94d-967b23650025","Type":"ContainerDied","Data":"34c7b5e3955204d18fbe411bec268d57d14a4e8b27a972381f33f454e07e3d24"} Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.205475 4834 scope.go:117] "RemoveContainer" containerID="74d0607133594134910f3e5a2b2b986d1d3c7f8763f17093a89e8ac30861a21b" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.205579 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.211111 4834 generic.go:334] "Generic (PLEG): container finished" podID="25058ace-b8d8-4ac1-924a-844f3955f0fb" containerID="066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f" exitCode=0 Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.211148 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"25058ace-b8d8-4ac1-924a-844f3955f0fb","Type":"ContainerDied","Data":"066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f"} Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.211173 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"25058ace-b8d8-4ac1-924a-844f3955f0fb","Type":"ContainerDied","Data":"828a01a0e55a04224a4fad80a60f718afb6028c4968d95acd3b1f055da0f4e9d"} Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.211223 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.245077 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.249449 4834 scope.go:117] "RemoveContainer" containerID="f2360140e760e2b0540dfc06873720e62ea3d3673fe4b602b6263cef14fad59a" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.260439 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.266734 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.273027 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.278972 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.290585 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 12:29:22 crc kubenswrapper[4834]: E1126 12:29:22.290959 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f30c5fe-7895-474e-a94d-967b23650025" containerName="rabbitmq" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.290971 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f30c5fe-7895-474e-a94d-967b23650025" containerName="rabbitmq" Nov 26 12:29:22 crc kubenswrapper[4834]: E1126 12:29:22.290986 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f30c5fe-7895-474e-a94d-967b23650025" containerName="setup-container" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.291000 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3f30c5fe-7895-474e-a94d-967b23650025" containerName="setup-container" Nov 26 12:29:22 crc kubenswrapper[4834]: E1126 12:29:22.291035 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25058ace-b8d8-4ac1-924a-844f3955f0fb" containerName="rabbitmq" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.291049 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="25058ace-b8d8-4ac1-924a-844f3955f0fb" containerName="rabbitmq" Nov 26 12:29:22 crc kubenswrapper[4834]: E1126 12:29:22.291058 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25058ace-b8d8-4ac1-924a-844f3955f0fb" containerName="setup-container" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.291064 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="25058ace-b8d8-4ac1-924a-844f3955f0fb" containerName="setup-container" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.291202 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f30c5fe-7895-474e-a94d-967b23650025" containerName="rabbitmq" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.291213 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="25058ace-b8d8-4ac1-924a-844f3955f0fb" containerName="rabbitmq" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.292087 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.294565 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.294852 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.295059 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.295167 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.295281 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.295402 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.296738 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.298750 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.298926 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.298929 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.298997 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.299283 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-w6st6" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.299440 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.299604 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.303574 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.303815 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bnpxq" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.306297 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.308102 4834 scope.go:117] "RemoveContainer" containerID="066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f" Nov 26 12:29:22 crc kubenswrapper[4834]: 
I1126 12:29:22.334541 4834 scope.go:117] "RemoveContainer" containerID="2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.342942 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.366477 4834 scope.go:117] "RemoveContainer" containerID="066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f" Nov 26 12:29:22 crc kubenswrapper[4834]: E1126 12:29:22.367232 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f\": container with ID starting with 066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f not found: ID does not exist" containerID="066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.367254 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f"} err="failed to get container status \"066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f\": rpc error: code = NotFound desc = could not find container \"066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f\": container with ID starting with 066ae0d743d075af7a9ed71b93ef92bf85ed7c13533a51f4b394bf8ff5a9947f not found: ID does not exist" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.367273 4834 scope.go:117] "RemoveContainer" containerID="2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c" Nov 26 12:29:22 crc kubenswrapper[4834]: E1126 12:29:22.369654 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c\": container with ID starting 
with 2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c not found: ID does not exist" containerID="2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.369680 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c"} err="failed to get container status \"2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c\": rpc error: code = NotFound desc = could not find container \"2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c\": container with ID starting with 2cf4d387c1a4e57ac5baff414c6ccde80345483004296ab0a883bbfa2f37d33c not found: ID does not exist" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.425821 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25058ace-b8d8-4ac1-924a-844f3955f0fb" path="/var/lib/kubelet/pods/25058ace-b8d8-4ac1-924a-844f3955f0fb/volumes" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.426555 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f30c5fe-7895-474e-a94d-967b23650025" path="/var/lib/kubelet/pods/3f30c5fe-7895-474e-a94d-967b23650025/volumes" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447360 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447395 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447422 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c521a82-8cae-4279-b12f-958ce3470c54-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447439 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447542 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjhs\" (UniqueName: \"kubernetes.io/projected/3c521a82-8cae-4279-b12f-958ce3470c54-kube-api-access-ctjhs\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447586 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447629 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " 
pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447670 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54gpt\" (UniqueName: \"kubernetes.io/projected/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-kube-api-access-54gpt\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447690 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447705 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447773 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c521a82-8cae-4279-b12f-958ce3470c54-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447791 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c521a82-8cae-4279-b12f-958ce3470c54-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " 
pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447893 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c521a82-8cae-4279-b12f-958ce3470c54-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447930 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.447965 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.448019 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c521a82-8cae-4279-b12f-958ce3470c54-config-data\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.448057 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 
12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.448081 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.448107 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.448125 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.448194 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.448246 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549400 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c521a82-8cae-4279-b12f-958ce3470c54-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549451 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549499 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549563 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c521a82-8cae-4279-b12f-958ce3470c54-config-data\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549600 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549622 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549649 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549667 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549698 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550217 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.549844 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") device mount path \"/mnt/openstack/pv03\"" 
pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550439 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c521a82-8cae-4279-b12f-958ce3470c54-config-data\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550523 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550574 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550639 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c521a82-8cae-4279-b12f-958ce3470c54-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550660 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550707 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ctjhs\" (UniqueName: \"kubernetes.io/projected/3c521a82-8cae-4279-b12f-958ce3470c54-kube-api-access-ctjhs\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550721 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550788 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.550844 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551229 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551327 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-erlang-cookie\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551367 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c521a82-8cae-4279-b12f-958ce3470c54-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551383 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551305 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551470 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54gpt\" (UniqueName: \"kubernetes.io/projected/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-kube-api-access-54gpt\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551493 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 
12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551513 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551563 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551566 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c521a82-8cae-4279-b12f-958ce3470c54-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551615 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c521a82-8cae-4279-b12f-958ce3470c54-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.551881 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.552151 4834 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.552705 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c521a82-8cae-4279-b12f-958ce3470c54-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.554471 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.554580 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c521a82-8cae-4279-b12f-958ce3470c54-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.555218 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.555421 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.555727 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.556004 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c521a82-8cae-4279-b12f-958ce3470c54-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.556921 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.557355 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c521a82-8cae-4279-b12f-958ce3470c54-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.565467 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctjhs\" (UniqueName: \"kubernetes.io/projected/3c521a82-8cae-4279-b12f-958ce3470c54-kube-api-access-ctjhs\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.566250 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-54gpt\" (UniqueName: \"kubernetes.io/projected/0aa91ce0-4843-4e7b-b02c-4cc94d001abd-kube-api-access-54gpt\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.572948 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0aa91ce0-4843-4e7b-b02c-4cc94d001abd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.573218 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3c521a82-8cae-4279-b12f-958ce3470c54\") " pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.636749 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 12:29:22 crc kubenswrapper[4834]: I1126 12:29:22.643114 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.029397 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.086206 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 12:29:23 crc kubenswrapper[4834]: W1126 12:29:23.099299 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c521a82_8cae_4279_b12f_958ce3470c54.slice/crio-27d221bc94e8ca3dc24b59044b068bec0ad53af26f873dca59d7bf2fbbc54c96 WatchSource:0}: Error finding container 27d221bc94e8ca3dc24b59044b068bec0ad53af26f873dca59d7bf2fbbc54c96: Status 404 returned error can't find the container with id 27d221bc94e8ca3dc24b59044b068bec0ad53af26f873dca59d7bf2fbbc54c96 Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.221018 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3c521a82-8cae-4279-b12f-958ce3470c54","Type":"ContainerStarted","Data":"27d221bc94e8ca3dc24b59044b068bec0ad53af26f873dca59d7bf2fbbc54c96"} Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.223401 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0aa91ce0-4843-4e7b-b02c-4cc94d001abd","Type":"ContainerStarted","Data":"78fc0e32d1a5d12f6bbc87abcd91d1208f13f96502de56e7cac7b7ad5017ea1a"} Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.580101 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64b6dd64c5-zp8xf"] Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.581802 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.585694 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.595212 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b6dd64c5-zp8xf"] Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.669846 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-openstack-edpm-ipam\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.669944 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgbv6\" (UniqueName: \"kubernetes.io/projected/b25efa7e-a676-4836-984d-0e88ad97afe1-kube-api-access-hgbv6\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.670044 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-config\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.670063 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-sb\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " 
pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.670199 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-dns-svc\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.670414 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-nb\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.771756 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-config\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.771793 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-sb\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.771823 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-dns-svc\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 
crc kubenswrapper[4834]: I1126 12:29:23.771858 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-nb\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.771888 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-openstack-edpm-ipam\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.771938 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgbv6\" (UniqueName: \"kubernetes.io/projected/b25efa7e-a676-4836-984d-0e88ad97afe1-kube-api-access-hgbv6\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.772885 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-config\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.773574 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-nb\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.773618 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-openstack-edpm-ipam\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.773820 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-dns-svc\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.773850 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-sb\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.883250 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgbv6\" (UniqueName: \"kubernetes.io/projected/b25efa7e-a676-4836-984d-0e88ad97afe1-kube-api-access-hgbv6\") pod \"dnsmasq-dns-64b6dd64c5-zp8xf\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:23 crc kubenswrapper[4834]: I1126 12:29:23.896790 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:24 crc kubenswrapper[4834]: I1126 12:29:24.232592 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0aa91ce0-4843-4e7b-b02c-4cc94d001abd","Type":"ContainerStarted","Data":"759cdc4b404f9873b067cdad63ba06ebb763ea02f3826b9f9a0f8ad57607f785"} Nov 26 12:29:24 crc kubenswrapper[4834]: I1126 12:29:24.354001 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b6dd64c5-zp8xf"] Nov 26 12:29:24 crc kubenswrapper[4834]: W1126 12:29:24.355474 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb25efa7e_a676_4836_984d_0e88ad97afe1.slice/crio-018997de77b7041043874e9a331750e00313c1eda5d4b34af725ca50b8a0334b WatchSource:0}: Error finding container 018997de77b7041043874e9a331750e00313c1eda5d4b34af725ca50b8a0334b: Status 404 returned error can't find the container with id 018997de77b7041043874e9a331750e00313c1eda5d4b34af725ca50b8a0334b Nov 26 12:29:25 crc kubenswrapper[4834]: I1126 12:29:25.241714 4834 generic.go:334] "Generic (PLEG): container finished" podID="b25efa7e-a676-4836-984d-0e88ad97afe1" containerID="4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f" exitCode=0 Nov 26 12:29:25 crc kubenswrapper[4834]: I1126 12:29:25.241767 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" event={"ID":"b25efa7e-a676-4836-984d-0e88ad97afe1","Type":"ContainerDied","Data":"4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f"} Nov 26 12:29:25 crc kubenswrapper[4834]: I1126 12:29:25.242117 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" event={"ID":"b25efa7e-a676-4836-984d-0e88ad97afe1","Type":"ContainerStarted","Data":"018997de77b7041043874e9a331750e00313c1eda5d4b34af725ca50b8a0334b"} Nov 26 12:29:25 crc 
kubenswrapper[4834]: I1126 12:29:25.245673 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3c521a82-8cae-4279-b12f-958ce3470c54","Type":"ContainerStarted","Data":"fabef56e1c4f98d51e205053ab52bd9c6e25a1819b9a034bfd17d4ac882fca0d"} Nov 26 12:29:26 crc kubenswrapper[4834]: I1126 12:29:26.254945 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" event={"ID":"b25efa7e-a676-4836-984d-0e88ad97afe1","Type":"ContainerStarted","Data":"92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794"} Nov 26 12:29:26 crc kubenswrapper[4834]: I1126 12:29:26.273945 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" podStartSLOduration=3.27393203 podStartE2EDuration="3.27393203s" podCreationTimestamp="2025-11-26 12:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:29:26.269784441 +0000 UTC m=+1064.176997794" watchObservedRunningTime="2025-11-26 12:29:26.27393203 +0000 UTC m=+1064.181145382" Nov 26 12:29:27 crc kubenswrapper[4834]: I1126 12:29:27.260961 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:33 crc kubenswrapper[4834]: I1126 12:29:33.899057 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:33 crc kubenswrapper[4834]: I1126 12:29:33.943900 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f95c456cf-mwjr5"] Nov 26 12:29:33 crc kubenswrapper[4834]: I1126 12:29:33.944124 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" podUID="99a2b0e3-787c-4555-9592-9ad76eeedc7e" containerName="dnsmasq-dns" 
containerID="cri-o://5621be9dfa21086b1a61bc8893995cd7d88c61eb94cf980cb393cbadd0e95612" gracePeriod=10 Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.036465 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c58867b6c-24msd"] Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.039890 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.044914 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c58867b6c-24msd"] Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.151854 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-nb\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.151905 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-openstack-edpm-ipam\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.151943 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-dns-svc\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.152019 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-config\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.152051 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-sb\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.152070 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zv7x\" (UniqueName: \"kubernetes.io/projected/f93c6d5e-760a-4fcf-881f-8132bc217c3d-kube-api-access-9zv7x\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.253993 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-nb\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.254049 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-openstack-edpm-ipam\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.254085 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-dns-svc\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.254126 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-config\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.254151 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-sb\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.254167 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zv7x\" (UniqueName: \"kubernetes.io/projected/f93c6d5e-760a-4fcf-881f-8132bc217c3d-kube-api-access-9zv7x\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.255064 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-nb\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.255292 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-sb\") pod 
\"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.255334 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-openstack-edpm-ipam\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.255424 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-dns-svc\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.256479 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-config\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.272469 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zv7x\" (UniqueName: \"kubernetes.io/projected/f93c6d5e-760a-4fcf-881f-8132bc217c3d-kube-api-access-9zv7x\") pod \"dnsmasq-dns-c58867b6c-24msd\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.309501 4834 generic.go:334] "Generic (PLEG): container finished" podID="99a2b0e3-787c-4555-9592-9ad76eeedc7e" containerID="5621be9dfa21086b1a61bc8893995cd7d88c61eb94cf980cb393cbadd0e95612" exitCode=0 Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.309540 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" event={"ID":"99a2b0e3-787c-4555-9592-9ad76eeedc7e","Type":"ContainerDied","Data":"5621be9dfa21086b1a61bc8893995cd7d88c61eb94cf980cb393cbadd0e95612"} Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.309566 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" event={"ID":"99a2b0e3-787c-4555-9592-9ad76eeedc7e","Type":"ContainerDied","Data":"ab1eddd489dc1f7f823f5d211a6866e68c3a985215e079cd30a1120884648f3a"} Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.309575 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab1eddd489dc1f7f823f5d211a6866e68c3a985215e079cd30a1120884648f3a" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.335759 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.393983 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.459874 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-nb\") pod \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.460175 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znq2h\" (UniqueName: \"kubernetes.io/projected/99a2b0e3-787c-4555-9592-9ad76eeedc7e-kube-api-access-znq2h\") pod \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.460236 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-dns-svc\") pod \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.460335 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-config\") pod \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.460439 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-sb\") pod \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\" (UID: \"99a2b0e3-787c-4555-9592-9ad76eeedc7e\") " Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.467461 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/99a2b0e3-787c-4555-9592-9ad76eeedc7e-kube-api-access-znq2h" (OuterVolumeSpecName: "kube-api-access-znq2h") pod "99a2b0e3-787c-4555-9592-9ad76eeedc7e" (UID: "99a2b0e3-787c-4555-9592-9ad76eeedc7e"). InnerVolumeSpecName "kube-api-access-znq2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.523766 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99a2b0e3-787c-4555-9592-9ad76eeedc7e" (UID: "99a2b0e3-787c-4555-9592-9ad76eeedc7e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.529341 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-config" (OuterVolumeSpecName: "config") pod "99a2b0e3-787c-4555-9592-9ad76eeedc7e" (UID: "99a2b0e3-787c-4555-9592-9ad76eeedc7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.530561 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99a2b0e3-787c-4555-9592-9ad76eeedc7e" (UID: "99a2b0e3-787c-4555-9592-9ad76eeedc7e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.533904 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99a2b0e3-787c-4555-9592-9ad76eeedc7e" (UID: "99a2b0e3-787c-4555-9592-9ad76eeedc7e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.568252 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.568285 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.568296 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znq2h\" (UniqueName: \"kubernetes.io/projected/99a2b0e3-787c-4555-9592-9ad76eeedc7e-kube-api-access-znq2h\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.568325 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.568341 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a2b0e3-787c-4555-9592-9ad76eeedc7e-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:34 crc kubenswrapper[4834]: I1126 12:29:34.825216 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c58867b6c-24msd"] Nov 26 12:29:34 crc kubenswrapper[4834]: W1126 12:29:34.825741 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf93c6d5e_760a_4fcf_881f_8132bc217c3d.slice/crio-203a1eb32a0c959b7d7d851b78df2738592699ccb30e8a092cc3ff800430df3e WatchSource:0}: Error finding container 203a1eb32a0c959b7d7d851b78df2738592699ccb30e8a092cc3ff800430df3e: Status 404 returned error can't find 
the container with id 203a1eb32a0c959b7d7d851b78df2738592699ccb30e8a092cc3ff800430df3e Nov 26 12:29:35 crc kubenswrapper[4834]: I1126 12:29:35.317422 4834 generic.go:334] "Generic (PLEG): container finished" podID="f93c6d5e-760a-4fcf-881f-8132bc217c3d" containerID="09cad6b18fe8864a06876896a5d12bb81658c3f8cf28586701f9f600852ae922" exitCode=0 Nov 26 12:29:35 crc kubenswrapper[4834]: I1126 12:29:35.317467 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c58867b6c-24msd" event={"ID":"f93c6d5e-760a-4fcf-881f-8132bc217c3d","Type":"ContainerDied","Data":"09cad6b18fe8864a06876896a5d12bb81658c3f8cf28586701f9f600852ae922"} Nov 26 12:29:35 crc kubenswrapper[4834]: I1126 12:29:35.317711 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f95c456cf-mwjr5" Nov 26 12:29:35 crc kubenswrapper[4834]: I1126 12:29:35.317732 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c58867b6c-24msd" event={"ID":"f93c6d5e-760a-4fcf-881f-8132bc217c3d","Type":"ContainerStarted","Data":"203a1eb32a0c959b7d7d851b78df2738592699ccb30e8a092cc3ff800430df3e"} Nov 26 12:29:35 crc kubenswrapper[4834]: I1126 12:29:35.455827 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f95c456cf-mwjr5"] Nov 26 12:29:35 crc kubenswrapper[4834]: I1126 12:29:35.462328 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f95c456cf-mwjr5"] Nov 26 12:29:36 crc kubenswrapper[4834]: I1126 12:29:36.325908 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c58867b6c-24msd" event={"ID":"f93c6d5e-760a-4fcf-881f-8132bc217c3d","Type":"ContainerStarted","Data":"bb910519d02a4fb561360c59bf461b5e6c9ac9404ecea01044b038c46368c939"} Nov 26 12:29:36 crc kubenswrapper[4834]: I1126 12:29:36.326252 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:36 crc 
kubenswrapper[4834]: I1126 12:29:36.341380 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c58867b6c-24msd" podStartSLOduration=2.341363593 podStartE2EDuration="2.341363593s" podCreationTimestamp="2025-11-26 12:29:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:29:36.339554852 +0000 UTC m=+1074.246768205" watchObservedRunningTime="2025-11-26 12:29:36.341363593 +0000 UTC m=+1074.248576944" Nov 26 12:29:36 crc kubenswrapper[4834]: I1126 12:29:36.424905 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a2b0e3-787c-4555-9592-9ad76eeedc7e" path="/var/lib/kubelet/pods/99a2b0e3-787c-4555-9592-9ad76eeedc7e/volumes" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.395458 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.440390 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b6dd64c5-zp8xf"] Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.440650 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" podUID="b25efa7e-a676-4836-984d-0e88ad97afe1" containerName="dnsmasq-dns" containerID="cri-o://92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794" gracePeriod=10 Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.790197 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.822492 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-config\") pod \"b25efa7e-a676-4836-984d-0e88ad97afe1\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.822572 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgbv6\" (UniqueName: \"kubernetes.io/projected/b25efa7e-a676-4836-984d-0e88ad97afe1-kube-api-access-hgbv6\") pod \"b25efa7e-a676-4836-984d-0e88ad97afe1\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.822645 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-dns-svc\") pod \"b25efa7e-a676-4836-984d-0e88ad97afe1\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.822678 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-sb\") pod \"b25efa7e-a676-4836-984d-0e88ad97afe1\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.822710 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-openstack-edpm-ipam\") pod \"b25efa7e-a676-4836-984d-0e88ad97afe1\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.822729 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-nb\") pod \"b25efa7e-a676-4836-984d-0e88ad97afe1\" (UID: \"b25efa7e-a676-4836-984d-0e88ad97afe1\") " Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.829254 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b25efa7e-a676-4836-984d-0e88ad97afe1-kube-api-access-hgbv6" (OuterVolumeSpecName: "kube-api-access-hgbv6") pod "b25efa7e-a676-4836-984d-0e88ad97afe1" (UID: "b25efa7e-a676-4836-984d-0e88ad97afe1"). InnerVolumeSpecName "kube-api-access-hgbv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.863758 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b25efa7e-a676-4836-984d-0e88ad97afe1" (UID: "b25efa7e-a676-4836-984d-0e88ad97afe1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.865461 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b25efa7e-a676-4836-984d-0e88ad97afe1" (UID: "b25efa7e-a676-4836-984d-0e88ad97afe1"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.865676 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b25efa7e-a676-4836-984d-0e88ad97afe1" (UID: "b25efa7e-a676-4836-984d-0e88ad97afe1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.874295 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b25efa7e-a676-4836-984d-0e88ad97afe1" (UID: "b25efa7e-a676-4836-984d-0e88ad97afe1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.881767 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-config" (OuterVolumeSpecName: "config") pod "b25efa7e-a676-4836-984d-0e88ad97afe1" (UID: "b25efa7e-a676-4836-984d-0e88ad97afe1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.924011 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgbv6\" (UniqueName: \"kubernetes.io/projected/b25efa7e-a676-4836-984d-0e88ad97afe1-kube-api-access-hgbv6\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.924035 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.924045 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.924054 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" 
Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.924062 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:44 crc kubenswrapper[4834]: I1126 12:29:44.924070 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b25efa7e-a676-4836-984d-0e88ad97afe1-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.385605 4834 generic.go:334] "Generic (PLEG): container finished" podID="b25efa7e-a676-4836-984d-0e88ad97afe1" containerID="92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794" exitCode=0 Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.385702 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.385727 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" event={"ID":"b25efa7e-a676-4836-984d-0e88ad97afe1","Type":"ContainerDied","Data":"92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794"} Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.385985 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b6dd64c5-zp8xf" event={"ID":"b25efa7e-a676-4836-984d-0e88ad97afe1","Type":"ContainerDied","Data":"018997de77b7041043874e9a331750e00313c1eda5d4b34af725ca50b8a0334b"} Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.386008 4834 scope.go:117] "RemoveContainer" containerID="92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794" Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.403664 4834 scope.go:117] "RemoveContainer" containerID="4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f" Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 
12:29:45.418963 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b6dd64c5-zp8xf"] Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.422720 4834 scope.go:117] "RemoveContainer" containerID="92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794" Nov 26 12:29:45 crc kubenswrapper[4834]: E1126 12:29:45.423172 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794\": container with ID starting with 92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794 not found: ID does not exist" containerID="92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794" Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.423206 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794"} err="failed to get container status \"92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794\": rpc error: code = NotFound desc = could not find container \"92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794\": container with ID starting with 92362625d866b2d60f4786b0ca0d89b3f1443aa92bafff09a2c4f816d7cd3794 not found: ID does not exist" Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.423227 4834 scope.go:117] "RemoveContainer" containerID="4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f" Nov 26 12:29:45 crc kubenswrapper[4834]: E1126 12:29:45.423556 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f\": container with ID starting with 4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f not found: ID does not exist" containerID="4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f" 
Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.423580 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f"} err="failed to get container status \"4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f\": rpc error: code = NotFound desc = could not find container \"4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f\": container with ID starting with 4aca25adf506bdef7853cca7619e271d6a8444b053b1adf6d18581deff1a131f not found: ID does not exist" Nov 26 12:29:45 crc kubenswrapper[4834]: I1126 12:29:45.427733 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64b6dd64c5-zp8xf"] Nov 26 12:29:46 crc kubenswrapper[4834]: I1126 12:29:46.424491 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b25efa7e-a676-4836-984d-0e88ad97afe1" path="/var/lib/kubelet/pods/b25efa7e-a676-4836-984d-0e88ad97afe1/volumes" Nov 26 12:29:51 crc kubenswrapper[4834]: I1126 12:29:51.531656 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:29:51 crc kubenswrapper[4834]: I1126 12:29:51.532341 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.559560 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2"] Nov 26 12:29:54 crc kubenswrapper[4834]: E1126 
12:29:54.560376 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25efa7e-a676-4836-984d-0e88ad97afe1" containerName="dnsmasq-dns" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.560392 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25efa7e-a676-4836-984d-0e88ad97afe1" containerName="dnsmasq-dns" Nov 26 12:29:54 crc kubenswrapper[4834]: E1126 12:29:54.560410 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25efa7e-a676-4836-984d-0e88ad97afe1" containerName="init" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.560416 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25efa7e-a676-4836-984d-0e88ad97afe1" containerName="init" Nov 26 12:29:54 crc kubenswrapper[4834]: E1126 12:29:54.560427 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a2b0e3-787c-4555-9592-9ad76eeedc7e" containerName="init" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.560434 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a2b0e3-787c-4555-9592-9ad76eeedc7e" containerName="init" Nov 26 12:29:54 crc kubenswrapper[4834]: E1126 12:29:54.560440 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a2b0e3-787c-4555-9592-9ad76eeedc7e" containerName="dnsmasq-dns" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.560445 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a2b0e3-787c-4555-9592-9ad76eeedc7e" containerName="dnsmasq-dns" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.560658 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a2b0e3-787c-4555-9592-9ad76eeedc7e" containerName="dnsmasq-dns" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.560686 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b25efa7e-a676-4836-984d-0e88ad97afe1" containerName="dnsmasq-dns" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.561446 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.563612 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.563622 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.564205 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.564327 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.569602 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2"] Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.602888 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.602950 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.602989 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5dxb\" (UniqueName: \"kubernetes.io/projected/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-kube-api-access-b5dxb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.603082 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.705472 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.705594 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.705636 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.705672 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5dxb\" (UniqueName: \"kubernetes.io/projected/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-kube-api-access-b5dxb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.711739 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.712347 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.712507 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.721800 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5dxb\" (UniqueName: \"kubernetes.io/projected/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-kube-api-access-b5dxb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:54 crc kubenswrapper[4834]: I1126 12:29:54.889460 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:29:55 crc kubenswrapper[4834]: I1126 12:29:55.389949 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2"] Nov 26 12:29:55 crc kubenswrapper[4834]: W1126 12:29:55.393990 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71c9f98e_072f_4f6b_928e_3ea0d2f44cc4.slice/crio-af7a96c245bd80b7dbc74beea07092ef83e2ce0be8427d24fa1d053bef69e295 WatchSource:0}: Error finding container af7a96c245bd80b7dbc74beea07092ef83e2ce0be8427d24fa1d053bef69e295: Status 404 returned error can't find the container with id af7a96c245bd80b7dbc74beea07092ef83e2ce0be8427d24fa1d053bef69e295 Nov 26 12:29:55 crc kubenswrapper[4834]: I1126 12:29:55.461617 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" event={"ID":"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4","Type":"ContainerStarted","Data":"af7a96c245bd80b7dbc74beea07092ef83e2ce0be8427d24fa1d053bef69e295"} Nov 26 12:29:56 crc kubenswrapper[4834]: I1126 12:29:56.471472 4834 generic.go:334] "Generic (PLEG): container finished" podID="0aa91ce0-4843-4e7b-b02c-4cc94d001abd" containerID="759cdc4b404f9873b067cdad63ba06ebb763ea02f3826b9f9a0f8ad57607f785" exitCode=0 Nov 26 12:29:56 crc kubenswrapper[4834]: I1126 12:29:56.471571 4834 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0aa91ce0-4843-4e7b-b02c-4cc94d001abd","Type":"ContainerDied","Data":"759cdc4b404f9873b067cdad63ba06ebb763ea02f3826b9f9a0f8ad57607f785"} Nov 26 12:29:56 crc kubenswrapper[4834]: I1126 12:29:56.475484 4834 generic.go:334] "Generic (PLEG): container finished" podID="3c521a82-8cae-4279-b12f-958ce3470c54" containerID="fabef56e1c4f98d51e205053ab52bd9c6e25a1819b9a034bfd17d4ac882fca0d" exitCode=0 Nov 26 12:29:56 crc kubenswrapper[4834]: I1126 12:29:56.475525 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3c521a82-8cae-4279-b12f-958ce3470c54","Type":"ContainerDied","Data":"fabef56e1c4f98d51e205053ab52bd9c6e25a1819b9a034bfd17d4ac882fca0d"} Nov 26 12:29:57 crc kubenswrapper[4834]: I1126 12:29:57.489908 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3c521a82-8cae-4279-b12f-958ce3470c54","Type":"ContainerStarted","Data":"f2f3770feb594b2053875202d2a4d2a040a741455ce4a2c595bf09347cc8f41b"} Nov 26 12:29:57 crc kubenswrapper[4834]: I1126 12:29:57.491532 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 26 12:29:57 crc kubenswrapper[4834]: I1126 12:29:57.496527 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0aa91ce0-4843-4e7b-b02c-4cc94d001abd","Type":"ContainerStarted","Data":"bc50e732c974278e82734fbe354e3dc630df46af75c18c4005c9cfa2dd587ff9"} Nov 26 12:29:57 crc kubenswrapper[4834]: I1126 12:29:57.497392 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:29:57 crc kubenswrapper[4834]: I1126 12:29:57.520651 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.520638673 podStartE2EDuration="35.520638673s" 
podCreationTimestamp="2025-11-26 12:29:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:29:57.512161694 +0000 UTC m=+1095.419375047" watchObservedRunningTime="2025-11-26 12:29:57.520638673 +0000 UTC m=+1095.427852026" Nov 26 12:29:57 crc kubenswrapper[4834]: I1126 12:29:57.534302 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.534292511 podStartE2EDuration="35.534292511s" podCreationTimestamp="2025-11-26 12:29:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:29:57.532963887 +0000 UTC m=+1095.440177238" watchObservedRunningTime="2025-11-26 12:29:57.534292511 +0000 UTC m=+1095.441505862" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.166019 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7"] Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.167742 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.174540 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7"] Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.174712 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.174843 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.347087 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06df8c45-278a-4645-b9c0-9ee6ced3f966-secret-volume\") pod \"collect-profiles-29402670-2w2l7\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.347187 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df8c45-278a-4645-b9c0-9ee6ced3f966-config-volume\") pod \"collect-profiles-29402670-2w2l7\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.347697 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2llk7\" (UniqueName: \"kubernetes.io/projected/06df8c45-278a-4645-b9c0-9ee6ced3f966-kube-api-access-2llk7\") pod \"collect-profiles-29402670-2w2l7\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.450350 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06df8c45-278a-4645-b9c0-9ee6ced3f966-secret-volume\") pod \"collect-profiles-29402670-2w2l7\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.450434 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df8c45-278a-4645-b9c0-9ee6ced3f966-config-volume\") pod \"collect-profiles-29402670-2w2l7\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.450706 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2llk7\" (UniqueName: \"kubernetes.io/projected/06df8c45-278a-4645-b9c0-9ee6ced3f966-kube-api-access-2llk7\") pod \"collect-profiles-29402670-2w2l7\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.452849 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df8c45-278a-4645-b9c0-9ee6ced3f966-config-volume\") pod \"collect-profiles-29402670-2w2l7\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.456504 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/06df8c45-278a-4645-b9c0-9ee6ced3f966-secret-volume\") pod \"collect-profiles-29402670-2w2l7\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.464895 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2llk7\" (UniqueName: \"kubernetes.io/projected/06df8c45-278a-4645-b9c0-9ee6ced3f966-kube-api-access-2llk7\") pod \"collect-profiles-29402670-2w2l7\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:00 crc kubenswrapper[4834]: I1126 12:30:00.487030 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:03 crc kubenswrapper[4834]: I1126 12:30:03.496512 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7"] Nov 26 12:30:03 crc kubenswrapper[4834]: W1126 12:30:03.500477 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06df8c45_278a_4645_b9c0_9ee6ced3f966.slice/crio-20202a806326787315c63f42d04e7a7ed17138d08d568c5f84ba7fc10da186de WatchSource:0}: Error finding container 20202a806326787315c63f42d04e7a7ed17138d08d568c5f84ba7fc10da186de: Status 404 returned error can't find the container with id 20202a806326787315c63f42d04e7a7ed17138d08d568c5f84ba7fc10da186de Nov 26 12:30:03 crc kubenswrapper[4834]: I1126 12:30:03.554611 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" event={"ID":"06df8c45-278a-4645-b9c0-9ee6ced3f966","Type":"ContainerStarted","Data":"20202a806326787315c63f42d04e7a7ed17138d08d568c5f84ba7fc10da186de"} Nov 26 12:30:03 crc 
kubenswrapper[4834]: I1126 12:30:03.556452 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" event={"ID":"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4","Type":"ContainerStarted","Data":"616698bbc8391482d68210a45c8ea38b76f177bfef0a42a6a9b8bc2e4144faa5"} Nov 26 12:30:03 crc kubenswrapper[4834]: I1126 12:30:03.577816 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" podStartSLOduration=1.8831363209999998 podStartE2EDuration="9.577796925s" podCreationTimestamp="2025-11-26 12:29:54 +0000 UTC" firstStartedPulling="2025-11-26 12:29:55.395866757 +0000 UTC m=+1093.303080109" lastFinishedPulling="2025-11-26 12:30:03.090527361 +0000 UTC m=+1100.997740713" observedRunningTime="2025-11-26 12:30:03.57005438 +0000 UTC m=+1101.477267732" watchObservedRunningTime="2025-11-26 12:30:03.577796925 +0000 UTC m=+1101.485010276" Nov 26 12:30:04 crc kubenswrapper[4834]: I1126 12:30:04.567710 4834 generic.go:334] "Generic (PLEG): container finished" podID="06df8c45-278a-4645-b9c0-9ee6ced3f966" containerID="22b4199dcef25d564763de3f7a9cc599fb9d1c4f841c114940f833193fb49567" exitCode=0 Nov 26 12:30:04 crc kubenswrapper[4834]: I1126 12:30:04.567824 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" event={"ID":"06df8c45-278a-4645-b9c0-9ee6ced3f966","Type":"ContainerDied","Data":"22b4199dcef25d564763de3f7a9cc599fb9d1c4f841c114940f833193fb49567"} Nov 26 12:30:05 crc kubenswrapper[4834]: I1126 12:30:05.837106 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:05 crc kubenswrapper[4834]: I1126 12:30:05.964808 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2llk7\" (UniqueName: \"kubernetes.io/projected/06df8c45-278a-4645-b9c0-9ee6ced3f966-kube-api-access-2llk7\") pod \"06df8c45-278a-4645-b9c0-9ee6ced3f966\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " Nov 26 12:30:05 crc kubenswrapper[4834]: I1126 12:30:05.965192 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df8c45-278a-4645-b9c0-9ee6ced3f966-config-volume\") pod \"06df8c45-278a-4645-b9c0-9ee6ced3f966\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " Nov 26 12:30:05 crc kubenswrapper[4834]: I1126 12:30:05.965300 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06df8c45-278a-4645-b9c0-9ee6ced3f966-secret-volume\") pod \"06df8c45-278a-4645-b9c0-9ee6ced3f966\" (UID: \"06df8c45-278a-4645-b9c0-9ee6ced3f966\") " Nov 26 12:30:05 crc kubenswrapper[4834]: I1126 12:30:05.966202 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06df8c45-278a-4645-b9c0-9ee6ced3f966-config-volume" (OuterVolumeSpecName: "config-volume") pod "06df8c45-278a-4645-b9c0-9ee6ced3f966" (UID: "06df8c45-278a-4645-b9c0-9ee6ced3f966"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:30:05 crc kubenswrapper[4834]: I1126 12:30:05.971224 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06df8c45-278a-4645-b9c0-9ee6ced3f966-kube-api-access-2llk7" (OuterVolumeSpecName: "kube-api-access-2llk7") pod "06df8c45-278a-4645-b9c0-9ee6ced3f966" (UID: "06df8c45-278a-4645-b9c0-9ee6ced3f966"). 
InnerVolumeSpecName "kube-api-access-2llk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:30:05 crc kubenswrapper[4834]: I1126 12:30:05.971285 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06df8c45-278a-4645-b9c0-9ee6ced3f966-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06df8c45-278a-4645-b9c0-9ee6ced3f966" (UID: "06df8c45-278a-4645-b9c0-9ee6ced3f966"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:30:06 crc kubenswrapper[4834]: I1126 12:30:06.068802 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df8c45-278a-4645-b9c0-9ee6ced3f966-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 12:30:06 crc kubenswrapper[4834]: I1126 12:30:06.068851 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06df8c45-278a-4645-b9c0-9ee6ced3f966-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 12:30:06 crc kubenswrapper[4834]: I1126 12:30:06.068868 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2llk7\" (UniqueName: \"kubernetes.io/projected/06df8c45-278a-4645-b9c0-9ee6ced3f966-kube-api-access-2llk7\") on node \"crc\" DevicePath \"\"" Nov 26 12:30:06 crc kubenswrapper[4834]: I1126 12:30:06.582370 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" event={"ID":"06df8c45-278a-4645-b9c0-9ee6ced3f966","Type":"ContainerDied","Data":"20202a806326787315c63f42d04e7a7ed17138d08d568c5f84ba7fc10da186de"} Nov 26 12:30:06 crc kubenswrapper[4834]: I1126 12:30:06.582412 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20202a806326787315c63f42d04e7a7ed17138d08d568c5f84ba7fc10da186de" Nov 26 12:30:06 crc kubenswrapper[4834]: I1126 12:30:06.582413 4834 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402670-2w2l7" Nov 26 12:30:12 crc kubenswrapper[4834]: I1126 12:30:12.640457 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 26 12:30:12 crc kubenswrapper[4834]: I1126 12:30:12.644795 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 26 12:30:15 crc kubenswrapper[4834]: I1126 12:30:15.645195 4834 generic.go:334] "Generic (PLEG): container finished" podID="71c9f98e-072f-4f6b-928e-3ea0d2f44cc4" containerID="616698bbc8391482d68210a45c8ea38b76f177bfef0a42a6a9b8bc2e4144faa5" exitCode=0 Nov 26 12:30:15 crc kubenswrapper[4834]: I1126 12:30:15.645265 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" event={"ID":"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4","Type":"ContainerDied","Data":"616698bbc8391482d68210a45c8ea38b76f177bfef0a42a6a9b8bc2e4144faa5"} Nov 26 12:30:16 crc kubenswrapper[4834]: I1126 12:30:16.962337 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.059525 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-inventory\") pod \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.059567 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-repo-setup-combined-ca-bundle\") pod \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.059627 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-ssh-key\") pod \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.059719 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5dxb\" (UniqueName: \"kubernetes.io/projected/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-kube-api-access-b5dxb\") pod \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\" (UID: \"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4\") " Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.064867 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-kube-api-access-b5dxb" (OuterVolumeSpecName: "kube-api-access-b5dxb") pod "71c9f98e-072f-4f6b-928e-3ea0d2f44cc4" (UID: "71c9f98e-072f-4f6b-928e-3ea0d2f44cc4"). InnerVolumeSpecName "kube-api-access-b5dxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.065495 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "71c9f98e-072f-4f6b-928e-3ea0d2f44cc4" (UID: "71c9f98e-072f-4f6b-928e-3ea0d2f44cc4"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.080614 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-inventory" (OuterVolumeSpecName: "inventory") pod "71c9f98e-072f-4f6b-928e-3ea0d2f44cc4" (UID: "71c9f98e-072f-4f6b-928e-3ea0d2f44cc4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.082199 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "71c9f98e-072f-4f6b-928e-3ea0d2f44cc4" (UID: "71c9f98e-072f-4f6b-928e-3ea0d2f44cc4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.162303 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.162596 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5dxb\" (UniqueName: \"kubernetes.io/projected/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-kube-api-access-b5dxb\") on node \"crc\" DevicePath \"\"" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.162608 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.162616 4834 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.660120 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" event={"ID":"71c9f98e-072f-4f6b-928e-3ea0d2f44cc4","Type":"ContainerDied","Data":"af7a96c245bd80b7dbc74beea07092ef83e2ce0be8427d24fa1d053bef69e295"} Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.660159 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af7a96c245bd80b7dbc74beea07092ef83e2ce0be8427d24fa1d053bef69e295" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.660169 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.717297 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w"] Nov 26 12:30:17 crc kubenswrapper[4834]: E1126 12:30:17.717676 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06df8c45-278a-4645-b9c0-9ee6ced3f966" containerName="collect-profiles" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.717692 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="06df8c45-278a-4645-b9c0-9ee6ced3f966" containerName="collect-profiles" Nov 26 12:30:17 crc kubenswrapper[4834]: E1126 12:30:17.717710 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c9f98e-072f-4f6b-928e-3ea0d2f44cc4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.717717 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c9f98e-072f-4f6b-928e-3ea0d2f44cc4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.717881 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c9f98e-072f-4f6b-928e-3ea0d2f44cc4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.717907 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="06df8c45-278a-4645-b9c0-9ee6ced3f966" containerName="collect-profiles" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.718474 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.720630 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.720841 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.721024 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.721155 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.729062 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w"] Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.771074 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.771159 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.771203 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h78r7\" (UniqueName: \"kubernetes.io/projected/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-kube-api-access-h78r7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.771279 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.873202 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.873272 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.873443 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.873517 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h78r7\" (UniqueName: \"kubernetes.io/projected/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-kube-api-access-h78r7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.877227 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.877548 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.877736 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:17 crc kubenswrapper[4834]: I1126 12:30:17.887047 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h78r7\" (UniqueName: \"kubernetes.io/projected/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-kube-api-access-h78r7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:18 crc kubenswrapper[4834]: I1126 12:30:18.031396 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:30:18 crc kubenswrapper[4834]: W1126 12:30:18.454856 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb41a5cd_cab1_4a34_a435_d5a34b7a21a1.slice/crio-1fa175d0e0c916a0a5ab6fc393c90b9b56b5bece27507831ec326d1714804fce WatchSource:0}: Error finding container 1fa175d0e0c916a0a5ab6fc393c90b9b56b5bece27507831ec326d1714804fce: Status 404 returned error can't find the container with id 1fa175d0e0c916a0a5ab6fc393c90b9b56b5bece27507831ec326d1714804fce Nov 26 12:30:18 crc kubenswrapper[4834]: I1126 12:30:18.455715 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w"] Nov 26 12:30:18 crc kubenswrapper[4834]: I1126 12:30:18.458265 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 12:30:18 crc kubenswrapper[4834]: I1126 12:30:18.668704 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" event={"ID":"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1","Type":"ContainerStarted","Data":"1fa175d0e0c916a0a5ab6fc393c90b9b56b5bece27507831ec326d1714804fce"} Nov 26 12:30:19 crc kubenswrapper[4834]: I1126 12:30:19.678445 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" 
event={"ID":"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1","Type":"ContainerStarted","Data":"c73626f919d0b959d8d0a2a80e0e8f48efe68993cbeb280f32f357b2a4a390a6"} Nov 26 12:30:19 crc kubenswrapper[4834]: I1126 12:30:19.708542 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" podStartSLOduration=2.180856736 podStartE2EDuration="2.708520808s" podCreationTimestamp="2025-11-26 12:30:17 +0000 UTC" firstStartedPulling="2025-11-26 12:30:18.458046031 +0000 UTC m=+1116.365259383" lastFinishedPulling="2025-11-26 12:30:18.985710104 +0000 UTC m=+1116.892923455" observedRunningTime="2025-11-26 12:30:19.705090829 +0000 UTC m=+1117.612304181" watchObservedRunningTime="2025-11-26 12:30:19.708520808 +0000 UTC m=+1117.615734160" Nov 26 12:30:21 crc kubenswrapper[4834]: I1126 12:30:21.531093 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:30:21 crc kubenswrapper[4834]: I1126 12:30:21.531456 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:30:21 crc kubenswrapper[4834]: I1126 12:30:21.531610 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:30:21 crc kubenswrapper[4834]: I1126 12:30:21.532942 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5856a524b96de1a2b853af8b527df2165a8d3f4d203109c2d5ffa66afdcc60ed"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:30:21 crc kubenswrapper[4834]: I1126 12:30:21.533015 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://5856a524b96de1a2b853af8b527df2165a8d3f4d203109c2d5ffa66afdcc60ed" gracePeriod=600 Nov 26 12:30:21 crc kubenswrapper[4834]: I1126 12:30:21.694998 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="5856a524b96de1a2b853af8b527df2165a8d3f4d203109c2d5ffa66afdcc60ed" exitCode=0 Nov 26 12:30:21 crc kubenswrapper[4834]: I1126 12:30:21.695042 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"5856a524b96de1a2b853af8b527df2165a8d3f4d203109c2d5ffa66afdcc60ed"} Nov 26 12:30:21 crc kubenswrapper[4834]: I1126 12:30:21.695077 4834 scope.go:117] "RemoveContainer" containerID="a30f1ec8aaa63fca565600131ca721203e6bad41e9594f352f74db2095a7e3eb" Nov 26 12:30:22 crc kubenswrapper[4834]: I1126 12:30:22.705029 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"acd09ade57ac546b50450d570a864d3e80d6aed5b0f2fac09427c5ddbb2d4ad1"} Nov 26 12:32:21 crc kubenswrapper[4834]: I1126 12:32:21.531769 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:32:21 crc kubenswrapper[4834]: I1126 12:32:21.532194 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:32:44 crc kubenswrapper[4834]: I1126 12:32:44.138018 4834 scope.go:117] "RemoveContainer" containerID="f09126d337e4fd4bf79ca32776e7c4a20525e37f98c4a8bb1cade6f555b4fd09" Nov 26 12:32:44 crc kubenswrapper[4834]: I1126 12:32:44.158009 4834 scope.go:117] "RemoveContainer" containerID="c720a02f9bff5b293043fe87ba72ad95273d785cbff5ff2a9ef4649e77c00aab" Nov 26 12:32:44 crc kubenswrapper[4834]: I1126 12:32:44.184007 4834 scope.go:117] "RemoveContainer" containerID="71ad9e10c017b08a6c8991a3f5e95d29784904a18ed6e5fd1eee7adf71954d08" Nov 26 12:32:44 crc kubenswrapper[4834]: I1126 12:32:44.219984 4834 scope.go:117] "RemoveContainer" containerID="3cb9146b384da2fc97aabd6065964b95908546eb74a18b646ed19772282e5130" Nov 26 12:32:44 crc kubenswrapper[4834]: I1126 12:32:44.239876 4834 scope.go:117] "RemoveContainer" containerID="3e31d5168665dd1bbdc361ae4db6fe75589cb8a019033458c739cdca0e68ea85" Nov 26 12:32:51 crc kubenswrapper[4834]: I1126 12:32:51.531648 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:32:51 crc kubenswrapper[4834]: I1126 12:32:51.532012 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" 
podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:33:11 crc kubenswrapper[4834]: I1126 12:33:11.894515 4834 generic.go:334] "Generic (PLEG): container finished" podID="cb41a5cd-cab1-4a34-a435-d5a34b7a21a1" containerID="c73626f919d0b959d8d0a2a80e0e8f48efe68993cbeb280f32f357b2a4a390a6" exitCode=0 Nov 26 12:33:11 crc kubenswrapper[4834]: I1126 12:33:11.894685 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" event={"ID":"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1","Type":"ContainerDied","Data":"c73626f919d0b959d8d0a2a80e0e8f48efe68993cbeb280f32f357b2a4a390a6"} Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.193002 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.320780 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-ssh-key\") pod \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.321073 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h78r7\" (UniqueName: \"kubernetes.io/projected/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-kube-api-access-h78r7\") pod \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.321216 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-inventory\") pod 
\"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.321234 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-bootstrap-combined-ca-bundle\") pod \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\" (UID: \"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1\") " Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.325769 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-kube-api-access-h78r7" (OuterVolumeSpecName: "kube-api-access-h78r7") pod "cb41a5cd-cab1-4a34-a435-d5a34b7a21a1" (UID: "cb41a5cd-cab1-4a34-a435-d5a34b7a21a1"). InnerVolumeSpecName "kube-api-access-h78r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.326216 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "cb41a5cd-cab1-4a34-a435-d5a34b7a21a1" (UID: "cb41a5cd-cab1-4a34-a435-d5a34b7a21a1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.341910 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cb41a5cd-cab1-4a34-a435-d5a34b7a21a1" (UID: "cb41a5cd-cab1-4a34-a435-d5a34b7a21a1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.342179 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-inventory" (OuterVolumeSpecName: "inventory") pod "cb41a5cd-cab1-4a34-a435-d5a34b7a21a1" (UID: "cb41a5cd-cab1-4a34-a435-d5a34b7a21a1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.423555 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.423670 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h78r7\" (UniqueName: \"kubernetes.io/projected/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-kube-api-access-h78r7\") on node \"crc\" DevicePath \"\"" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.423687 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.423696 4834 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.908642 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" event={"ID":"cb41a5cd-cab1-4a34-a435-d5a34b7a21a1","Type":"ContainerDied","Data":"1fa175d0e0c916a0a5ab6fc393c90b9b56b5bece27507831ec326d1714804fce"} Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.908675 4834 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="1fa175d0e0c916a0a5ab6fc393c90b9b56b5bece27507831ec326d1714804fce" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.908856 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.972588 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w"] Nov 26 12:33:13 crc kubenswrapper[4834]: E1126 12:33:13.973000 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb41a5cd-cab1-4a34-a435-d5a34b7a21a1" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.973021 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb41a5cd-cab1-4a34-a435-d5a34b7a21a1" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.973195 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb41a5cd-cab1-4a34-a435-d5a34b7a21a1" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.974143 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.976445 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.976543 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.978173 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.978419 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:33:13 crc kubenswrapper[4834]: I1126 12:33:13.988497 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w"] Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.055046 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cz65w\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.055128 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cz65w\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.055212 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwk6\" (UniqueName: \"kubernetes.io/projected/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-kube-api-access-ccwk6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cz65w\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.156836 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwk6\" (UniqueName: \"kubernetes.io/projected/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-kube-api-access-ccwk6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cz65w\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.156908 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cz65w\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.156970 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cz65w\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.160780 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-cz65w\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.160921 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cz65w\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.171378 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwk6\" (UniqueName: \"kubernetes.io/projected/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-kube-api-access-ccwk6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cz65w\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.292624 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.712762 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w"] Nov 26 12:33:14 crc kubenswrapper[4834]: I1126 12:33:14.917099 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" event={"ID":"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c","Type":"ContainerStarted","Data":"7874d8a01549d7c1c977366d674afc62e7b946379df29eea6c7f46beb1347165"} Nov 26 12:33:15 crc kubenswrapper[4834]: I1126 12:33:15.925165 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" event={"ID":"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c","Type":"ContainerStarted","Data":"1f0922cc64244aa912e289bebb22a03b4667f43dcd58cc0a33950ad8aecfe95f"} Nov 26 12:33:15 crc kubenswrapper[4834]: I1126 12:33:15.947438 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" podStartSLOduration=2.167999576 podStartE2EDuration="2.947422767s" podCreationTimestamp="2025-11-26 12:33:13 +0000 UTC" firstStartedPulling="2025-11-26 12:33:14.71483352 +0000 UTC m=+1292.622046871" lastFinishedPulling="2025-11-26 12:33:15.494256709 +0000 UTC m=+1293.401470062" observedRunningTime="2025-11-26 12:33:15.939323031 +0000 UTC m=+1293.846536382" watchObservedRunningTime="2025-11-26 12:33:15.947422767 +0000 UTC m=+1293.854636119" Nov 26 12:33:21 crc kubenswrapper[4834]: I1126 12:33:21.531752 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Nov 26 12:33:21 crc kubenswrapper[4834]: I1126 12:33:21.532041 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:33:21 crc kubenswrapper[4834]: I1126 12:33:21.532082 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:33:21 crc kubenswrapper[4834]: I1126 12:33:21.532978 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"acd09ade57ac546b50450d570a864d3e80d6aed5b0f2fac09427c5ddbb2d4ad1"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:33:21 crc kubenswrapper[4834]: I1126 12:33:21.533043 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://acd09ade57ac546b50450d570a864d3e80d6aed5b0f2fac09427c5ddbb2d4ad1" gracePeriod=600 Nov 26 12:33:21 crc kubenswrapper[4834]: I1126 12:33:21.970296 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="acd09ade57ac546b50450d570a864d3e80d6aed5b0f2fac09427c5ddbb2d4ad1" exitCode=0 Nov 26 12:33:21 crc kubenswrapper[4834]: I1126 12:33:21.970401 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" 
event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"acd09ade57ac546b50450d570a864d3e80d6aed5b0f2fac09427c5ddbb2d4ad1"} Nov 26 12:33:21 crc kubenswrapper[4834]: I1126 12:33:21.970484 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"} Nov 26 12:33:21 crc kubenswrapper[4834]: I1126 12:33:21.970509 4834 scope.go:117] "RemoveContainer" containerID="5856a524b96de1a2b853af8b527df2165a8d3f4d203109c2d5ffa66afdcc60ed" Nov 26 12:33:44 crc kubenswrapper[4834]: I1126 12:33:44.302487 4834 scope.go:117] "RemoveContainer" containerID="0af6fbc528da14750d78fc5ae6a22c310a7be054dbb6d32f6e4fc4b8fa853e8e" Nov 26 12:33:44 crc kubenswrapper[4834]: I1126 12:33:44.328780 4834 scope.go:117] "RemoveContainer" containerID="90c5dbe11347bc7e18488697192007690a99c7ecc9f00c441a2f7ffb56c79440" Nov 26 12:34:15 crc kubenswrapper[4834]: I1126 12:34:15.435022 4834 generic.go:334] "Generic (PLEG): container finished" podID="8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c" containerID="1f0922cc64244aa912e289bebb22a03b4667f43dcd58cc0a33950ad8aecfe95f" exitCode=0 Nov 26 12:34:15 crc kubenswrapper[4834]: I1126 12:34:15.435133 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" event={"ID":"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c","Type":"ContainerDied","Data":"1f0922cc64244aa912e289bebb22a03b4667f43dcd58cc0a33950ad8aecfe95f"} Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.759846 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.858182 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-inventory\") pod \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.858280 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccwk6\" (UniqueName: \"kubernetes.io/projected/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-kube-api-access-ccwk6\") pod \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.858341 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-ssh-key\") pod \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\" (UID: \"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c\") " Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.865082 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-kube-api-access-ccwk6" (OuterVolumeSpecName: "kube-api-access-ccwk6") pod "8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c" (UID: "8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c"). InnerVolumeSpecName "kube-api-access-ccwk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.882036 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c" (UID: "8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.882933 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-inventory" (OuterVolumeSpecName: "inventory") pod "8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c" (UID: "8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.961622 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccwk6\" (UniqueName: \"kubernetes.io/projected/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-kube-api-access-ccwk6\") on node \"crc\" DevicePath \"\"" Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.961894 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:34:16 crc kubenswrapper[4834]: I1126 12:34:16.961906 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.454464 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" event={"ID":"8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c","Type":"ContainerDied","Data":"7874d8a01549d7c1c977366d674afc62e7b946379df29eea6c7f46beb1347165"} Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.454511 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7874d8a01549d7c1c977366d674afc62e7b946379df29eea6c7f46beb1347165" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.454515 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.532592 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz"] Nov 26 12:34:17 crc kubenswrapper[4834]: E1126 12:34:17.533072 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.533095 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.533357 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.534058 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.536032 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.536162 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.537391 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.537501 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.540071 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz"] Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.679432 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.679837 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.679953 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfbxq\" (UniqueName: \"kubernetes.io/projected/442421c1-5218-4074-8ee3-6673c9e16308-kube-api-access-wfbxq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.782957 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.784058 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfbxq\" (UniqueName: \"kubernetes.io/projected/442421c1-5218-4074-8ee3-6673c9e16308-kube-api-access-wfbxq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.784256 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.787701 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.787856 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.799630 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfbxq\" (UniqueName: \"kubernetes.io/projected/442421c1-5218-4074-8ee3-6673c9e16308-kube-api-access-wfbxq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:17 crc kubenswrapper[4834]: I1126 12:34:17.847512 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:18 crc kubenswrapper[4834]: I1126 12:34:18.330124 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz"] Nov 26 12:34:18 crc kubenswrapper[4834]: I1126 12:34:18.465618 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" event={"ID":"442421c1-5218-4074-8ee3-6673c9e16308","Type":"ContainerStarted","Data":"79062150b57398dbf6ae33b1169767cc30c0f9e1c597312c748ecad07cac9eac"} Nov 26 12:34:19 crc kubenswrapper[4834]: I1126 12:34:19.480457 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" event={"ID":"442421c1-5218-4074-8ee3-6673c9e16308","Type":"ContainerStarted","Data":"a54064b1949a728c4a3914fe325f342d908d377366c8e9bc5a601ff54c145873"} Nov 26 12:34:19 crc kubenswrapper[4834]: I1126 12:34:19.501413 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" podStartSLOduration=1.7869010360000002 podStartE2EDuration="2.501392892s" podCreationTimestamp="2025-11-26 12:34:17 +0000 UTC" firstStartedPulling="2025-11-26 12:34:18.333117975 +0000 UTC m=+1356.240331327" lastFinishedPulling="2025-11-26 12:34:19.047609831 +0000 UTC m=+1356.954823183" observedRunningTime="2025-11-26 12:34:19.496235676 +0000 UTC m=+1357.403449028" watchObservedRunningTime="2025-11-26 12:34:19.501392892 +0000 UTC m=+1357.408606244" Nov 26 12:34:23 crc kubenswrapper[4834]: I1126 12:34:23.520206 4834 generic.go:334] "Generic (PLEG): container finished" podID="442421c1-5218-4074-8ee3-6673c9e16308" containerID="a54064b1949a728c4a3914fe325f342d908d377366c8e9bc5a601ff54c145873" exitCode=0 Nov 26 12:34:23 crc kubenswrapper[4834]: I1126 12:34:23.520291 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" event={"ID":"442421c1-5218-4074-8ee3-6673c9e16308","Type":"ContainerDied","Data":"a54064b1949a728c4a3914fe325f342d908d377366c8e9bc5a601ff54c145873"} Nov 26 12:34:24 crc kubenswrapper[4834]: I1126 12:34:24.880795 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.030048 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-inventory\") pod \"442421c1-5218-4074-8ee3-6673c9e16308\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.030116 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfbxq\" (UniqueName: \"kubernetes.io/projected/442421c1-5218-4074-8ee3-6673c9e16308-kube-api-access-wfbxq\") pod \"442421c1-5218-4074-8ee3-6673c9e16308\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.030418 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-ssh-key\") pod \"442421c1-5218-4074-8ee3-6673c9e16308\" (UID: \"442421c1-5218-4074-8ee3-6673c9e16308\") " Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.036280 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442421c1-5218-4074-8ee3-6673c9e16308-kube-api-access-wfbxq" (OuterVolumeSpecName: "kube-api-access-wfbxq") pod "442421c1-5218-4074-8ee3-6673c9e16308" (UID: "442421c1-5218-4074-8ee3-6673c9e16308"). InnerVolumeSpecName "kube-api-access-wfbxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.053561 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "442421c1-5218-4074-8ee3-6673c9e16308" (UID: "442421c1-5218-4074-8ee3-6673c9e16308"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.053881 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-inventory" (OuterVolumeSpecName: "inventory") pod "442421c1-5218-4074-8ee3-6673c9e16308" (UID: "442421c1-5218-4074-8ee3-6673c9e16308"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.133242 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.133271 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfbxq\" (UniqueName: \"kubernetes.io/projected/442421c1-5218-4074-8ee3-6673c9e16308-kube-api-access-wfbxq\") on node \"crc\" DevicePath \"\"" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.133284 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/442421c1-5218-4074-8ee3-6673c9e16308-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.541587 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" 
event={"ID":"442421c1-5218-4074-8ee3-6673c9e16308","Type":"ContainerDied","Data":"79062150b57398dbf6ae33b1169767cc30c0f9e1c597312c748ecad07cac9eac"} Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.542050 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79062150b57398dbf6ae33b1169767cc30c0f9e1c597312c748ecad07cac9eac" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.541689 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.605531 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8"] Nov 26 12:34:25 crc kubenswrapper[4834]: E1126 12:34:25.606037 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442421c1-5218-4074-8ee3-6673c9e16308" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.606060 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="442421c1-5218-4074-8ee3-6673c9e16308" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.606236 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="442421c1-5218-4074-8ee3-6673c9e16308" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.606946 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.608701 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.608825 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.612863 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.614263 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.614510 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8"] Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.744798 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64nd8\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.744853 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64nd8\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.744931 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6lpn\" (UniqueName: \"kubernetes.io/projected/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-kube-api-access-x6lpn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64nd8\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.847045 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6lpn\" (UniqueName: \"kubernetes.io/projected/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-kube-api-access-x6lpn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64nd8\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.847692 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64nd8\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.847745 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64nd8\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.851859 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64nd8\" (UID: 
\"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.852085 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64nd8\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.863863 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6lpn\" (UniqueName: \"kubernetes.io/projected/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-kube-api-access-x6lpn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64nd8\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:25 crc kubenswrapper[4834]: I1126 12:34:25.930121 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:26 crc kubenswrapper[4834]: I1126 12:34:26.380582 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8"] Nov 26 12:34:26 crc kubenswrapper[4834]: I1126 12:34:26.550356 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" event={"ID":"d204d6c9-614d-4ee7-9e82-e4d3b2402a43","Type":"ContainerStarted","Data":"175abf97819e76b684b06260b7340a90aa31bba31abcf18d46482ff94718f7fc"} Nov 26 12:34:27 crc kubenswrapper[4834]: I1126 12:34:27.563526 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" event={"ID":"d204d6c9-614d-4ee7-9e82-e4d3b2402a43","Type":"ContainerStarted","Data":"52ca1883f082639a9a510e3cfa4438a410b3f7514f960125f7adea0d09769623"} Nov 26 12:34:27 crc kubenswrapper[4834]: I1126 12:34:27.581620 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" podStartSLOduration=1.9332152809999998 podStartE2EDuration="2.581606037s" podCreationTimestamp="2025-11-26 12:34:25 +0000 UTC" firstStartedPulling="2025-11-26 12:34:26.390234943 +0000 UTC m=+1364.297448294" lastFinishedPulling="2025-11-26 12:34:27.038625697 +0000 UTC m=+1364.945839050" observedRunningTime="2025-11-26 12:34:27.579126 +0000 UTC m=+1365.486339352" watchObservedRunningTime="2025-11-26 12:34:27.581606037 +0000 UTC m=+1365.488819389" Nov 26 12:34:44 crc kubenswrapper[4834]: I1126 12:34:44.381848 4834 scope.go:117] "RemoveContainer" containerID="c1a217ed1cc95559d2b354fb883f6617944de533db0bf19f912d1fb2545338bd" Nov 26 12:34:44 crc kubenswrapper[4834]: I1126 12:34:44.407427 4834 scope.go:117] "RemoveContainer" containerID="61b452d4a80ec245831983ced0fcc0a0c4e85024681ad711651cb863513b1904" Nov 26 12:34:44 crc 
kubenswrapper[4834]: I1126 12:34:44.449075 4834 scope.go:117] "RemoveContainer" containerID="5621be9dfa21086b1a61bc8893995cd7d88c61eb94cf980cb393cbadd0e95612" Nov 26 12:34:44 crc kubenswrapper[4834]: I1126 12:34:44.485968 4834 scope.go:117] "RemoveContainer" containerID="12f67a3e666e156e51fd52e8017a07107066c703e0a539b1fbca75516d66a312" Nov 26 12:34:53 crc kubenswrapper[4834]: I1126 12:34:53.813466 4834 generic.go:334] "Generic (PLEG): container finished" podID="d204d6c9-614d-4ee7-9e82-e4d3b2402a43" containerID="52ca1883f082639a9a510e3cfa4438a410b3f7514f960125f7adea0d09769623" exitCode=0 Nov 26 12:34:53 crc kubenswrapper[4834]: I1126 12:34:53.813556 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" event={"ID":"d204d6c9-614d-4ee7-9e82-e4d3b2402a43","Type":"ContainerDied","Data":"52ca1883f082639a9a510e3cfa4438a410b3f7514f960125f7adea0d09769623"} Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.114077 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.239918 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-inventory\") pod \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.239990 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6lpn\" (UniqueName: \"kubernetes.io/projected/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-kube-api-access-x6lpn\") pod \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.240016 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-ssh-key\") pod \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\" (UID: \"d204d6c9-614d-4ee7-9e82-e4d3b2402a43\") " Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.244884 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-kube-api-access-x6lpn" (OuterVolumeSpecName: "kube-api-access-x6lpn") pod "d204d6c9-614d-4ee7-9e82-e4d3b2402a43" (UID: "d204d6c9-614d-4ee7-9e82-e4d3b2402a43"). InnerVolumeSpecName "kube-api-access-x6lpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.261441 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d204d6c9-614d-4ee7-9e82-e4d3b2402a43" (UID: "d204d6c9-614d-4ee7-9e82-e4d3b2402a43"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.261743 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-inventory" (OuterVolumeSpecName: "inventory") pod "d204d6c9-614d-4ee7-9e82-e4d3b2402a43" (UID: "d204d6c9-614d-4ee7-9e82-e4d3b2402a43"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.342169 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.342196 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6lpn\" (UniqueName: \"kubernetes.io/projected/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-kube-api-access-x6lpn\") on node \"crc\" DevicePath \"\"" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.342207 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d204d6c9-614d-4ee7-9e82-e4d3b2402a43-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.828751 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" event={"ID":"d204d6c9-614d-4ee7-9e82-e4d3b2402a43","Type":"ContainerDied","Data":"175abf97819e76b684b06260b7340a90aa31bba31abcf18d46482ff94718f7fc"} Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.828786 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="175abf97819e76b684b06260b7340a90aa31bba31abcf18d46482ff94718f7fc" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.828806 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.889870 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj"] Nov 26 12:34:55 crc kubenswrapper[4834]: E1126 12:34:55.890267 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d204d6c9-614d-4ee7-9e82-e4d3b2402a43" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.890285 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d204d6c9-614d-4ee7-9e82-e4d3b2402a43" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.890487 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d204d6c9-614d-4ee7-9e82-e4d3b2402a43" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.891106 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.894110 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.894140 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.894253 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.897880 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.900089 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj"] Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.952718 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6pc\" (UniqueName: \"kubernetes.io/projected/798933d4-3b38-48e2-8e14-0272d7daf788-kube-api-access-qk6pc\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.952793 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:55 crc kubenswrapper[4834]: I1126 12:34:55.953000 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:56 crc kubenswrapper[4834]: I1126 12:34:56.053863 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6pc\" (UniqueName: \"kubernetes.io/projected/798933d4-3b38-48e2-8e14-0272d7daf788-kube-api-access-qk6pc\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:56 crc kubenswrapper[4834]: I1126 12:34:56.054103 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:56 crc kubenswrapper[4834]: I1126 12:34:56.054148 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:56 crc kubenswrapper[4834]: I1126 12:34:56.057249 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj\" (UID: 
\"798933d4-3b38-48e2-8e14-0272d7daf788\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:56 crc kubenswrapper[4834]: I1126 12:34:56.057796 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:56 crc kubenswrapper[4834]: I1126 12:34:56.068166 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6pc\" (UniqueName: \"kubernetes.io/projected/798933d4-3b38-48e2-8e14-0272d7daf788-kube-api-access-qk6pc\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:56 crc kubenswrapper[4834]: I1126 12:34:56.206703 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:34:56 crc kubenswrapper[4834]: I1126 12:34:56.652744 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj"] Nov 26 12:34:56 crc kubenswrapper[4834]: I1126 12:34:56.837875 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" event={"ID":"798933d4-3b38-48e2-8e14-0272d7daf788","Type":"ContainerStarted","Data":"08d0bf7118218938463a4c6cc214318557e4b90b39c67d3160d132ad86904d59"} Nov 26 12:34:57 crc kubenswrapper[4834]: I1126 12:34:57.845911 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" event={"ID":"798933d4-3b38-48e2-8e14-0272d7daf788","Type":"ContainerStarted","Data":"337833c4d8253d2415960a7c25de7e080586b7e3e49eedafc9da223167448d74"} Nov 26 12:34:57 crc kubenswrapper[4834]: I1126 12:34:57.861631 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" podStartSLOduration=2.294315777 podStartE2EDuration="2.861616193s" podCreationTimestamp="2025-11-26 12:34:55 +0000 UTC" firstStartedPulling="2025-11-26 12:34:56.661211678 +0000 UTC m=+1394.568425030" lastFinishedPulling="2025-11-26 12:34:57.228512094 +0000 UTC m=+1395.135725446" observedRunningTime="2025-11-26 12:34:57.859262033 +0000 UTC m=+1395.766475385" watchObservedRunningTime="2025-11-26 12:34:57.861616193 +0000 UTC m=+1395.768829546" Nov 26 12:35:00 crc kubenswrapper[4834]: I1126 12:35:00.867249 4834 generic.go:334] "Generic (PLEG): container finished" podID="798933d4-3b38-48e2-8e14-0272d7daf788" containerID="337833c4d8253d2415960a7c25de7e080586b7e3e49eedafc9da223167448d74" exitCode=0 Nov 26 12:35:00 crc kubenswrapper[4834]: I1126 12:35:00.867351 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" event={"ID":"798933d4-3b38-48e2-8e14-0272d7daf788","Type":"ContainerDied","Data":"337833c4d8253d2415960a7c25de7e080586b7e3e49eedafc9da223167448d74"} Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.159216 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.356962 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk6pc\" (UniqueName: \"kubernetes.io/projected/798933d4-3b38-48e2-8e14-0272d7daf788-kube-api-access-qk6pc\") pod \"798933d4-3b38-48e2-8e14-0272d7daf788\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.357282 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-inventory\") pod \"798933d4-3b38-48e2-8e14-0272d7daf788\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.357323 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-ssh-key\") pod \"798933d4-3b38-48e2-8e14-0272d7daf788\" (UID: \"798933d4-3b38-48e2-8e14-0272d7daf788\") " Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.364983 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798933d4-3b38-48e2-8e14-0272d7daf788-kube-api-access-qk6pc" (OuterVolumeSpecName: "kube-api-access-qk6pc") pod "798933d4-3b38-48e2-8e14-0272d7daf788" (UID: "798933d4-3b38-48e2-8e14-0272d7daf788"). InnerVolumeSpecName "kube-api-access-qk6pc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.378186 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-inventory" (OuterVolumeSpecName: "inventory") pod "798933d4-3b38-48e2-8e14-0272d7daf788" (UID: "798933d4-3b38-48e2-8e14-0272d7daf788"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.378204 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "798933d4-3b38-48e2-8e14-0272d7daf788" (UID: "798933d4-3b38-48e2-8e14-0272d7daf788"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.459570 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk6pc\" (UniqueName: \"kubernetes.io/projected/798933d4-3b38-48e2-8e14-0272d7daf788-kube-api-access-qk6pc\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.459599 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.459687 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/798933d4-3b38-48e2-8e14-0272d7daf788-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.883332 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" 
event={"ID":"798933d4-3b38-48e2-8e14-0272d7daf788","Type":"ContainerDied","Data":"08d0bf7118218938463a4c6cc214318557e4b90b39c67d3160d132ad86904d59"} Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.883380 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08d0bf7118218938463a4c6cc214318557e4b90b39c67d3160d132ad86904d59" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.883429 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.926234 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf"] Nov 26 12:35:02 crc kubenswrapper[4834]: E1126 12:35:02.926632 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798933d4-3b38-48e2-8e14-0272d7daf788" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.926649 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="798933d4-3b38-48e2-8e14-0272d7daf788" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.926815 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="798933d4-3b38-48e2-8e14-0272d7daf788" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.927415 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.928904 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.928995 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.929191 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.929396 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.933034 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf"] Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.967077 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lnfv\" (UniqueName: \"kubernetes.io/projected/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-kube-api-access-9lnfv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.967152 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:02 crc kubenswrapper[4834]: I1126 12:35:02.967207 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:03 crc kubenswrapper[4834]: I1126 12:35:03.068015 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lnfv\" (UniqueName: \"kubernetes.io/projected/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-kube-api-access-9lnfv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:03 crc kubenswrapper[4834]: I1126 12:35:03.068074 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:03 crc kubenswrapper[4834]: I1126 12:35:03.068097 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:03 crc kubenswrapper[4834]: I1126 12:35:03.072120 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf\" (UID: 
\"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:03 crc kubenswrapper[4834]: I1126 12:35:03.072369 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:03 crc kubenswrapper[4834]: I1126 12:35:03.081399 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lnfv\" (UniqueName: \"kubernetes.io/projected/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-kube-api-access-9lnfv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:03 crc kubenswrapper[4834]: I1126 12:35:03.245605 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:03 crc kubenswrapper[4834]: I1126 12:35:03.690499 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf"] Nov 26 12:35:03 crc kubenswrapper[4834]: I1126 12:35:03.899811 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" event={"ID":"4a6f6d49-1d40-44e3-8e84-25ea8a06743e","Type":"ContainerStarted","Data":"c3fd285e66ac9a3effb3940352e8cd3ca5db93145c03f4f96846da667a2351cf"} Nov 26 12:35:04 crc kubenswrapper[4834]: I1126 12:35:04.922578 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" event={"ID":"4a6f6d49-1d40-44e3-8e84-25ea8a06743e","Type":"ContainerStarted","Data":"d4e556eb1e52bdad84e09095b481e032082bc42bc2beb56913e1b84939fbf3fa"} Nov 26 12:35:04 crc kubenswrapper[4834]: I1126 12:35:04.942392 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" podStartSLOduration=2.387429196 podStartE2EDuration="2.942379871s" podCreationTimestamp="2025-11-26 12:35:02 +0000 UTC" firstStartedPulling="2025-11-26 12:35:03.695133294 +0000 UTC m=+1401.602346646" lastFinishedPulling="2025-11-26 12:35:04.250083969 +0000 UTC m=+1402.157297321" observedRunningTime="2025-11-26 12:35:04.940915801 +0000 UTC m=+1402.848129153" watchObservedRunningTime="2025-11-26 12:35:04.942379871 +0000 UTC m=+1402.849593223" Nov 26 12:35:21 crc kubenswrapper[4834]: I1126 12:35:21.530812 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:35:21 crc 
kubenswrapper[4834]: I1126 12:35:21.531206 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:35:26 crc kubenswrapper[4834]: I1126 12:35:26.033713 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4da5-account-create-update-46kvg"] Nov 26 12:35:26 crc kubenswrapper[4834]: I1126 12:35:26.041060 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4da5-account-create-update-46kvg"] Nov 26 12:35:26 crc kubenswrapper[4834]: I1126 12:35:26.424793 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a043f350-a471-483b-aa31-117d66b38cf5" path="/var/lib/kubelet/pods/a043f350-a471-483b-aa31-117d66b38cf5/volumes" Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.034931 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7fa5-account-create-update-t2rx7"] Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.047001 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c888-account-create-update-txj78"] Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.055979 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rzjvj"] Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.062617 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-wm9lb"] Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.067985 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c888-account-create-update-txj78"] Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.083856 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-7fa5-account-create-update-t2rx7"] Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.088217 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8lrs8"] Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.098973 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-wm9lb"] Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.107984 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8lrs8"] Nov 26 12:35:27 crc kubenswrapper[4834]: I1126 12:35:27.114378 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rzjvj"] Nov 26 12:35:28 crc kubenswrapper[4834]: I1126 12:35:28.427121 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e8838f6-8981-4f7d-a871-f09435bfc1ee" path="/var/lib/kubelet/pods/0e8838f6-8981-4f7d-a871-f09435bfc1ee/volumes" Nov 26 12:35:28 crc kubenswrapper[4834]: I1126 12:35:28.428266 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f18cba3-901e-45d3-9f1a-a04a17fe1b4d" path="/var/lib/kubelet/pods/0f18cba3-901e-45d3-9f1a-a04a17fe1b4d/volumes" Nov 26 12:35:28 crc kubenswrapper[4834]: I1126 12:35:28.428917 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68884564-b059-4562-8410-956a50be744a" path="/var/lib/kubelet/pods/68884564-b059-4562-8410-956a50be744a/volumes" Nov 26 12:35:28 crc kubenswrapper[4834]: I1126 12:35:28.429570 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80b41954-7fc3-4e18-9fde-08323d1a5aa6" path="/var/lib/kubelet/pods/80b41954-7fc3-4e18-9fde-08323d1a5aa6/volumes" Nov 26 12:35:28 crc kubenswrapper[4834]: I1126 12:35:28.430734 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befcee03-7239-4d9e-915d-7fa7f9ebfb44" path="/var/lib/kubelet/pods/befcee03-7239-4d9e-915d-7fa7f9ebfb44/volumes" Nov 26 12:35:39 crc kubenswrapper[4834]: I1126 
12:35:39.181737 4834 generic.go:334] "Generic (PLEG): container finished" podID="4a6f6d49-1d40-44e3-8e84-25ea8a06743e" containerID="d4e556eb1e52bdad84e09095b481e032082bc42bc2beb56913e1b84939fbf3fa" exitCode=0 Nov 26 12:35:39 crc kubenswrapper[4834]: I1126 12:35:39.181816 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" event={"ID":"4a6f6d49-1d40-44e3-8e84-25ea8a06743e","Type":"ContainerDied","Data":"d4e556eb1e52bdad84e09095b481e032082bc42bc2beb56913e1b84939fbf3fa"} Nov 26 12:35:40 crc kubenswrapper[4834]: I1126 12:35:40.508300 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:40 crc kubenswrapper[4834]: I1126 12:35:40.621593 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-inventory\") pod \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " Nov 26 12:35:40 crc kubenswrapper[4834]: I1126 12:35:40.621901 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lnfv\" (UniqueName: \"kubernetes.io/projected/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-kube-api-access-9lnfv\") pod \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " Nov 26 12:35:40 crc kubenswrapper[4834]: I1126 12:35:40.622094 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-ssh-key\") pod \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\" (UID: \"4a6f6d49-1d40-44e3-8e84-25ea8a06743e\") " Nov 26 12:35:40 crc kubenswrapper[4834]: I1126 12:35:40.626345 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-kube-api-access-9lnfv" (OuterVolumeSpecName: "kube-api-access-9lnfv") pod "4a6f6d49-1d40-44e3-8e84-25ea8a06743e" (UID: "4a6f6d49-1d40-44e3-8e84-25ea8a06743e"). InnerVolumeSpecName "kube-api-access-9lnfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:35:40 crc kubenswrapper[4834]: I1126 12:35:40.644179 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-inventory" (OuterVolumeSpecName: "inventory") pod "4a6f6d49-1d40-44e3-8e84-25ea8a06743e" (UID: "4a6f6d49-1d40-44e3-8e84-25ea8a06743e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:35:40 crc kubenswrapper[4834]: I1126 12:35:40.644961 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4a6f6d49-1d40-44e3-8e84-25ea8a06743e" (UID: "4a6f6d49-1d40-44e3-8e84-25ea8a06743e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:40.724256 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:40.724287 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:40.724298 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lnfv\" (UniqueName: \"kubernetes.io/projected/4a6f6d49-1d40-44e3-8e84-25ea8a06743e-kube-api-access-9lnfv\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.200845 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" event={"ID":"4a6f6d49-1d40-44e3-8e84-25ea8a06743e","Type":"ContainerDied","Data":"c3fd285e66ac9a3effb3940352e8cd3ca5db93145c03f4f96846da667a2351cf"} Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.200924 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3fd285e66ac9a3effb3940352e8cd3ca5db93145c03f4f96846da667a2351cf" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.200941 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.262325 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pbfgn"] Nov 26 12:35:41 crc kubenswrapper[4834]: E1126 12:35:41.262845 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6f6d49-1d40-44e3-8e84-25ea8a06743e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.262864 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6f6d49-1d40-44e3-8e84-25ea8a06743e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.263052 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6f6d49-1d40-44e3-8e84-25ea8a06743e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.263772 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.265681 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.265686 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.265926 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.266706 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.268657 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pbfgn"] Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.337717 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pbfgn\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.337830 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pbfgn\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.337966 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-msvmr\" (UniqueName: \"kubernetes.io/projected/3070833c-b52d-4d58-baf2-76bbf8a315b3-kube-api-access-msvmr\") pod \"ssh-known-hosts-edpm-deployment-pbfgn\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.439275 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pbfgn\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.439367 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pbfgn\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.439450 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msvmr\" (UniqueName: \"kubernetes.io/projected/3070833c-b52d-4d58-baf2-76bbf8a315b3-kube-api-access-msvmr\") pod \"ssh-known-hosts-edpm-deployment-pbfgn\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.442538 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pbfgn\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.442641 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pbfgn\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.453032 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msvmr\" (UniqueName: \"kubernetes.io/projected/3070833c-b52d-4d58-baf2-76bbf8a315b3-kube-api-access-msvmr\") pod \"ssh-known-hosts-edpm-deployment-pbfgn\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:41 crc kubenswrapper[4834]: I1126 12:35:41.578893 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:42 crc kubenswrapper[4834]: I1126 12:35:42.021340 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pbfgn"] Nov 26 12:35:42 crc kubenswrapper[4834]: W1126 12:35:42.023342 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3070833c_b52d_4d58_baf2_76bbf8a315b3.slice/crio-e58b1a88aacc489dc83f4ee45aff2510d2503de230658b468513da57b539aa3f WatchSource:0}: Error finding container e58b1a88aacc489dc83f4ee45aff2510d2503de230658b468513da57b539aa3f: Status 404 returned error can't find the container with id e58b1a88aacc489dc83f4ee45aff2510d2503de230658b468513da57b539aa3f Nov 26 12:35:42 crc kubenswrapper[4834]: I1126 12:35:42.025599 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 12:35:42 crc kubenswrapper[4834]: I1126 12:35:42.209719 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" event={"ID":"3070833c-b52d-4d58-baf2-76bbf8a315b3","Type":"ContainerStarted","Data":"e58b1a88aacc489dc83f4ee45aff2510d2503de230658b468513da57b539aa3f"} Nov 26 12:35:42 crc kubenswrapper[4834]: I1126 12:35:42.665576 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:35:43 crc kubenswrapper[4834]: I1126 12:35:43.218931 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" event={"ID":"3070833c-b52d-4d58-baf2-76bbf8a315b3","Type":"ContainerStarted","Data":"7bbbd9de0f4ed12dbf67a0fe6bcaddcf6aaf829b68393b03879f5c2ef6d7083a"} Nov 26 12:35:43 crc kubenswrapper[4834]: I1126 12:35:43.239298 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" podStartSLOduration=1.6014794129999999 podStartE2EDuration="2.239284418s" podCreationTimestamp="2025-11-26 12:35:41 +0000 UTC" firstStartedPulling="2025-11-26 12:35:42.025280726 +0000 UTC m=+1439.932494078" lastFinishedPulling="2025-11-26 12:35:42.663085731 +0000 UTC m=+1440.570299083" observedRunningTime="2025-11-26 12:35:43.232553354 +0000 UTC m=+1441.139766707" watchObservedRunningTime="2025-11-26 12:35:43.239284418 +0000 UTC m=+1441.146497771" Nov 26 12:35:44 crc kubenswrapper[4834]: I1126 12:35:44.534569 4834 scope.go:117] "RemoveContainer" containerID="247a86de535a490d43352f9c5b84cb2fc502769f25f3d8927c150f8d508d4953" Nov 26 12:35:44 crc kubenswrapper[4834]: I1126 12:35:44.553215 4834 scope.go:117] "RemoveContainer" containerID="37f4d14d3e94a1f32e6c5e55c04ebd2f89f021bc9738852546082a8228e3df1c" Nov 26 12:35:44 crc kubenswrapper[4834]: I1126 12:35:44.584763 4834 scope.go:117] "RemoveContainer" containerID="6f6cfe17b022b47b23e8855789246d181a4f11dbd89b7f79281fc8c0769219ed" Nov 26 12:35:44 crc kubenswrapper[4834]: I1126 12:35:44.612234 4834 scope.go:117] "RemoveContainer" 
containerID="feb7821555cf800b22fd3e1c2858ed939d9778299f04c6d7d31105857f017a3d" Nov 26 12:35:44 crc kubenswrapper[4834]: I1126 12:35:44.637861 4834 scope.go:117] "RemoveContainer" containerID="16f70ec28c0dc21712e2ff1513c0b545325790bc8041d8b96fc08c491861e475" Nov 26 12:35:44 crc kubenswrapper[4834]: I1126 12:35:44.664115 4834 scope.go:117] "RemoveContainer" containerID="d901867de9f0e91087afc5ead70be4e5c65daae684033fa9aeb243f352949716" Nov 26 12:35:46 crc kubenswrapper[4834]: I1126 12:35:46.034348 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-t2gkm"] Nov 26 12:35:46 crc kubenswrapper[4834]: I1126 12:35:46.041495 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-t2gkm"] Nov 26 12:35:46 crc kubenswrapper[4834]: I1126 12:35:46.425607 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d56ce8c-1412-4272-8905-e251251f4f64" path="/var/lib/kubelet/pods/6d56ce8c-1412-4272-8905-e251251f4f64/volumes" Nov 26 12:35:48 crc kubenswrapper[4834]: I1126 12:35:48.255555 4834 generic.go:334] "Generic (PLEG): container finished" podID="3070833c-b52d-4d58-baf2-76bbf8a315b3" containerID="7bbbd9de0f4ed12dbf67a0fe6bcaddcf6aaf829b68393b03879f5c2ef6d7083a" exitCode=0 Nov 26 12:35:48 crc kubenswrapper[4834]: I1126 12:35:48.255595 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" event={"ID":"3070833c-b52d-4d58-baf2-76bbf8a315b3","Type":"ContainerDied","Data":"7bbbd9de0f4ed12dbf67a0fe6bcaddcf6aaf829b68393b03879f5c2ef6d7083a"} Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.550447 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.677897 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-inventory-0\") pod \"3070833c-b52d-4d58-baf2-76bbf8a315b3\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.678060 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msvmr\" (UniqueName: \"kubernetes.io/projected/3070833c-b52d-4d58-baf2-76bbf8a315b3-kube-api-access-msvmr\") pod \"3070833c-b52d-4d58-baf2-76bbf8a315b3\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.678092 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-ssh-key-openstack-edpm-ipam\") pod \"3070833c-b52d-4d58-baf2-76bbf8a315b3\" (UID: \"3070833c-b52d-4d58-baf2-76bbf8a315b3\") " Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.682196 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3070833c-b52d-4d58-baf2-76bbf8a315b3-kube-api-access-msvmr" (OuterVolumeSpecName: "kube-api-access-msvmr") pod "3070833c-b52d-4d58-baf2-76bbf8a315b3" (UID: "3070833c-b52d-4d58-baf2-76bbf8a315b3"). InnerVolumeSpecName "kube-api-access-msvmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.698639 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3070833c-b52d-4d58-baf2-76bbf8a315b3" (UID: "3070833c-b52d-4d58-baf2-76bbf8a315b3"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.698949 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3070833c-b52d-4d58-baf2-76bbf8a315b3" (UID: "3070833c-b52d-4d58-baf2-76bbf8a315b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.780095 4834 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.780181 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msvmr\" (UniqueName: \"kubernetes.io/projected/3070833c-b52d-4d58-baf2-76bbf8a315b3-kube-api-access-msvmr\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:49 crc kubenswrapper[4834]: I1126 12:35:49.780236 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3070833c-b52d-4d58-baf2-76bbf8a315b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.271518 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" event={"ID":"3070833c-b52d-4d58-baf2-76bbf8a315b3","Type":"ContainerDied","Data":"e58b1a88aacc489dc83f4ee45aff2510d2503de230658b468513da57b539aa3f"} Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.271553 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pbfgn" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.271559 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e58b1a88aacc489dc83f4ee45aff2510d2503de230658b468513da57b539aa3f" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.320911 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h"] Nov 26 12:35:50 crc kubenswrapper[4834]: E1126 12:35:50.321416 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3070833c-b52d-4d58-baf2-76bbf8a315b3" containerName="ssh-known-hosts-edpm-deployment" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.321477 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3070833c-b52d-4d58-baf2-76bbf8a315b3" containerName="ssh-known-hosts-edpm-deployment" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.321705 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3070833c-b52d-4d58-baf2-76bbf8a315b3" containerName="ssh-known-hosts-edpm-deployment" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.322324 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.325394 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.326353 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.326466 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.327144 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.332126 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h"] Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.492420 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bq88h\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.492464 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwp5t\" (UniqueName: \"kubernetes.io/projected/cc7c233f-2fa5-4441-a8e2-229bc771e093-kube-api-access-hwp5t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bq88h\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.492492 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bq88h\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.594442 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bq88h\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.594480 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwp5t\" (UniqueName: \"kubernetes.io/projected/cc7c233f-2fa5-4441-a8e2-229bc771e093-kube-api-access-hwp5t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bq88h\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.594509 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bq88h\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.598606 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bq88h\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.599380 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bq88h\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.608503 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwp5t\" (UniqueName: \"kubernetes.io/projected/cc7c233f-2fa5-4441-a8e2-229bc771e093-kube-api-access-hwp5t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bq88h\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:50 crc kubenswrapper[4834]: I1126 12:35:50.635923 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:51 crc kubenswrapper[4834]: I1126 12:35:51.078505 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h"] Nov 26 12:35:51 crc kubenswrapper[4834]: I1126 12:35:51.280821 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" event={"ID":"cc7c233f-2fa5-4441-a8e2-229bc771e093","Type":"ContainerStarted","Data":"bb1bc4fb2113ec9a871643e29aeb16c638babba4ef799c60c0a7f3528cc04d5d"} Nov 26 12:35:51 crc kubenswrapper[4834]: I1126 12:35:51.531137 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:35:51 crc kubenswrapper[4834]: I1126 12:35:51.531187 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:35:52 crc kubenswrapper[4834]: I1126 12:35:52.290020 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" event={"ID":"cc7c233f-2fa5-4441-a8e2-229bc771e093","Type":"ContainerStarted","Data":"ab3e8d77415d1f722a91daaa236954cb3c1a052e5e797bc4c56470d258e86cdb"} Nov 26 12:35:52 crc kubenswrapper[4834]: I1126 12:35:52.310424 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" podStartSLOduration=1.7939331539999999 podStartE2EDuration="2.310408837s" podCreationTimestamp="2025-11-26 
12:35:50 +0000 UTC" firstStartedPulling="2025-11-26 12:35:51.089358926 +0000 UTC m=+1448.996572277" lastFinishedPulling="2025-11-26 12:35:51.605834608 +0000 UTC m=+1449.513047960" observedRunningTime="2025-11-26 12:35:52.303884763 +0000 UTC m=+1450.211098116" watchObservedRunningTime="2025-11-26 12:35:52.310408837 +0000 UTC m=+1450.217622190" Nov 26 12:35:57 crc kubenswrapper[4834]: I1126 12:35:57.023510 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qmhct"] Nov 26 12:35:57 crc kubenswrapper[4834]: I1126 12:35:57.033100 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0e4b-account-create-update-z6xkv"] Nov 26 12:35:57 crc kubenswrapper[4834]: I1126 12:35:57.041116 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-04bc-account-create-update-5k2d8"] Nov 26 12:35:57 crc kubenswrapper[4834]: I1126 12:35:57.048495 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qmhct"] Nov 26 12:35:57 crc kubenswrapper[4834]: I1126 12:35:57.053581 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-04bc-account-create-update-5k2d8"] Nov 26 12:35:57 crc kubenswrapper[4834]: I1126 12:35:57.058818 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0e4b-account-create-update-z6xkv"] Nov 26 12:35:57 crc kubenswrapper[4834]: I1126 12:35:57.328619 4834 generic.go:334] "Generic (PLEG): container finished" podID="cc7c233f-2fa5-4441-a8e2-229bc771e093" containerID="ab3e8d77415d1f722a91daaa236954cb3c1a052e5e797bc4c56470d258e86cdb" exitCode=0 Nov 26 12:35:57 crc kubenswrapper[4834]: I1126 12:35:57.328698 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" event={"ID":"cc7c233f-2fa5-4441-a8e2-229bc771e093","Type":"ContainerDied","Data":"ab3e8d77415d1f722a91daaa236954cb3c1a052e5e797bc4c56470d258e86cdb"} Nov 26 12:35:58 crc kubenswrapper[4834]: 
I1126 12:35:58.023419 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d03a-account-create-update-2ttnl"] Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.031963 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7742r"] Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.037923 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8q49t"] Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.042983 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7742r"] Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.047814 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8q49t"] Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.052908 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d03a-account-create-update-2ttnl"] Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.428438 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6" path="/var/lib/kubelet/pods/09f0e4a6-71d0-45b3-b96f-2fdf28bbdbc6/volumes" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.428969 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ebb8de-ce70-4164-8375-945db41f55e4" path="/var/lib/kubelet/pods/16ebb8de-ce70-4164-8375-945db41f55e4/volumes" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.429496 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef" path="/var/lib/kubelet/pods/a3f7b0b8-9b1b-479c-ad1f-5f2d82b0acef/volumes" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.429977 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c681553c-78af-4a9a-bcf1-a03424706e78" path="/var/lib/kubelet/pods/c681553c-78af-4a9a-bcf1-a03424706e78/volumes" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 
12:35:58.430893 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e912bd72-a05d-4456-92f8-42a5eca2621b" path="/var/lib/kubelet/pods/e912bd72-a05d-4456-92f8-42a5eca2621b/volumes" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.431587 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6" path="/var/lib/kubelet/pods/ff8842d2-f5fc-49bc-a2de-2b71d41fd6a6/volumes" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.633099 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.634782 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwp5t\" (UniqueName: \"kubernetes.io/projected/cc7c233f-2fa5-4441-a8e2-229bc771e093-kube-api-access-hwp5t\") pod \"cc7c233f-2fa5-4441-a8e2-229bc771e093\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.639139 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc7c233f-2fa5-4441-a8e2-229bc771e093-kube-api-access-hwp5t" (OuterVolumeSpecName: "kube-api-access-hwp5t") pod "cc7c233f-2fa5-4441-a8e2-229bc771e093" (UID: "cc7c233f-2fa5-4441-a8e2-229bc771e093"). InnerVolumeSpecName "kube-api-access-hwp5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.736510 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-inventory\") pod \"cc7c233f-2fa5-4441-a8e2-229bc771e093\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.736572 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-ssh-key\") pod \"cc7c233f-2fa5-4441-a8e2-229bc771e093\" (UID: \"cc7c233f-2fa5-4441-a8e2-229bc771e093\") " Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.736956 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwp5t\" (UniqueName: \"kubernetes.io/projected/cc7c233f-2fa5-4441-a8e2-229bc771e093-kube-api-access-hwp5t\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.755806 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-inventory" (OuterVolumeSpecName: "inventory") pod "cc7c233f-2fa5-4441-a8e2-229bc771e093" (UID: "cc7c233f-2fa5-4441-a8e2-229bc771e093"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.756390 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cc7c233f-2fa5-4441-a8e2-229bc771e093" (UID: "cc7c233f-2fa5-4441-a8e2-229bc771e093"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.839223 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:58 crc kubenswrapper[4834]: I1126 12:35:58.839249 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc7c233f-2fa5-4441-a8e2-229bc771e093-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.356215 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" event={"ID":"cc7c233f-2fa5-4441-a8e2-229bc771e093","Type":"ContainerDied","Data":"bb1bc4fb2113ec9a871643e29aeb16c638babba4ef799c60c0a7f3528cc04d5d"} Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.356430 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.359924 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1bc4fb2113ec9a871643e29aeb16c638babba4ef799c60c0a7f3528cc04d5d" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.402506 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg"] Nov 26 12:35:59 crc kubenswrapper[4834]: E1126 12:35:59.402964 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc7c233f-2fa5-4441-a8e2-229bc771e093" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.402984 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc7c233f-2fa5-4441-a8e2-229bc771e093" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.403159 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc7c233f-2fa5-4441-a8e2-229bc771e093" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.403861 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.412713 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.412758 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.412788 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.412867 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.414034 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg"] Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.557805 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.557901 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.557964 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4hx2\" (UniqueName: \"kubernetes.io/projected/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-kube-api-access-g4hx2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.659996 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.660098 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.660212 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4hx2\" (UniqueName: \"kubernetes.io/projected/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-kube-api-access-g4hx2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.663214 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg\" (UID: 
\"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.663632 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.674653 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4hx2\" (UniqueName: \"kubernetes.io/projected/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-kube-api-access-g4hx2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:35:59 crc kubenswrapper[4834]: I1126 12:35:59.725210 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:36:00 crc kubenswrapper[4834]: I1126 12:36:00.163090 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg"] Nov 26 12:36:00 crc kubenswrapper[4834]: I1126 12:36:00.365193 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" event={"ID":"fadb4daf-e9ae-4a04-a176-5bff2d64dea6","Type":"ContainerStarted","Data":"fd662fac1cbf41b6c12ffee175fd6f254552eed5ab8339ac66046015a5cd3ca7"} Nov 26 12:36:01 crc kubenswrapper[4834]: I1126 12:36:01.028065 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9p5t2"] Nov 26 12:36:01 crc kubenswrapper[4834]: I1126 12:36:01.033672 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9p5t2"] Nov 26 12:36:01 crc kubenswrapper[4834]: I1126 12:36:01.374061 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" event={"ID":"fadb4daf-e9ae-4a04-a176-5bff2d64dea6","Type":"ContainerStarted","Data":"e10a2ba759403c41fa78a601b1d66ac0faacccbe4b8e3fcab6a97c044ca3155c"} Nov 26 12:36:02 crc kubenswrapper[4834]: I1126 12:36:02.426227 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0e6cda-4565-4342-b83f-39df9cdd4207" path="/var/lib/kubelet/pods/1a0e6cda-4565-4342-b83f-39df9cdd4207/volumes" Nov 26 12:36:08 crc kubenswrapper[4834]: I1126 12:36:08.427934 4834 generic.go:334] "Generic (PLEG): container finished" podID="fadb4daf-e9ae-4a04-a176-5bff2d64dea6" containerID="e10a2ba759403c41fa78a601b1d66ac0faacccbe4b8e3fcab6a97c044ca3155c" exitCode=0 Nov 26 12:36:08 crc kubenswrapper[4834]: I1126 12:36:08.427999 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" 
event={"ID":"fadb4daf-e9ae-4a04-a176-5bff2d64dea6","Type":"ContainerDied","Data":"e10a2ba759403c41fa78a601b1d66ac0faacccbe4b8e3fcab6a97c044ca3155c"} Nov 26 12:36:09 crc kubenswrapper[4834]: I1126 12:36:09.725646 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:36:09 crc kubenswrapper[4834]: I1126 12:36:09.841557 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4hx2\" (UniqueName: \"kubernetes.io/projected/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-kube-api-access-g4hx2\") pod \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " Nov 26 12:36:09 crc kubenswrapper[4834]: I1126 12:36:09.841791 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-inventory\") pod \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " Nov 26 12:36:09 crc kubenswrapper[4834]: I1126 12:36:09.841872 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-ssh-key\") pod \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " Nov 26 12:36:09 crc kubenswrapper[4834]: I1126 12:36:09.847057 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-kube-api-access-g4hx2" (OuterVolumeSpecName: "kube-api-access-g4hx2") pod "fadb4daf-e9ae-4a04-a176-5bff2d64dea6" (UID: "fadb4daf-e9ae-4a04-a176-5bff2d64dea6"). InnerVolumeSpecName "kube-api-access-g4hx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:36:09 crc kubenswrapper[4834]: E1126 12:36:09.861015 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-inventory podName:fadb4daf-e9ae-4a04-a176-5bff2d64dea6 nodeName:}" failed. No retries permitted until 2025-11-26 12:36:10.360995714 +0000 UTC m=+1468.268209066 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-inventory") pod "fadb4daf-e9ae-4a04-a176-5bff2d64dea6" (UID: "fadb4daf-e9ae-4a04-a176-5bff2d64dea6") : error deleting /var/lib/kubelet/pods/fadb4daf-e9ae-4a04-a176-5bff2d64dea6/volume-subpaths: remove /var/lib/kubelet/pods/fadb4daf-e9ae-4a04-a176-5bff2d64dea6/volume-subpaths: no such file or directory Nov 26 12:36:09 crc kubenswrapper[4834]: I1126 12:36:09.863072 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fadb4daf-e9ae-4a04-a176-5bff2d64dea6" (UID: "fadb4daf-e9ae-4a04-a176-5bff2d64dea6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:36:09 crc kubenswrapper[4834]: I1126 12:36:09.943905 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:36:09 crc kubenswrapper[4834]: I1126 12:36:09.943930 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4hx2\" (UniqueName: \"kubernetes.io/projected/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-kube-api-access-g4hx2\") on node \"crc\" DevicePath \"\"" Nov 26 12:36:10 crc kubenswrapper[4834]: I1126 12:36:10.442393 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" event={"ID":"fadb4daf-e9ae-4a04-a176-5bff2d64dea6","Type":"ContainerDied","Data":"fd662fac1cbf41b6c12ffee175fd6f254552eed5ab8339ac66046015a5cd3ca7"} Nov 26 12:36:10 crc kubenswrapper[4834]: I1126 12:36:10.442430 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd662fac1cbf41b6c12ffee175fd6f254552eed5ab8339ac66046015a5cd3ca7" Nov 26 12:36:10 crc kubenswrapper[4834]: I1126 12:36:10.442439 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg" Nov 26 12:36:10 crc kubenswrapper[4834]: I1126 12:36:10.453221 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-inventory\") pod \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\" (UID: \"fadb4daf-e9ae-4a04-a176-5bff2d64dea6\") " Nov 26 12:36:10 crc kubenswrapper[4834]: I1126 12:36:10.457249 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-inventory" (OuterVolumeSpecName: "inventory") pod "fadb4daf-e9ae-4a04-a176-5bff2d64dea6" (UID: "fadb4daf-e9ae-4a04-a176-5bff2d64dea6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:36:10 crc kubenswrapper[4834]: I1126 12:36:10.556251 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fadb4daf-e9ae-4a04-a176-5bff2d64dea6-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:36:18 crc kubenswrapper[4834]: I1126 12:36:18.042589 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7lbgp"] Nov 26 12:36:18 crc kubenswrapper[4834]: I1126 12:36:18.049474 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7lbgp"] Nov 26 12:36:18 crc kubenswrapper[4834]: I1126 12:36:18.426357 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f95bd75-d740-47da-9ff2-d13cd8914aa4" path="/var/lib/kubelet/pods/9f95bd75-d740-47da-9ff2-d13cd8914aa4/volumes" Nov 26 12:36:21 crc kubenswrapper[4834]: I1126 12:36:21.531274 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Nov 26 12:36:21 crc kubenswrapper[4834]: I1126 12:36:21.532201 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:36:21 crc kubenswrapper[4834]: I1126 12:36:21.532289 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:36:21 crc kubenswrapper[4834]: I1126 12:36:21.533098 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:36:21 crc kubenswrapper[4834]: I1126 12:36:21.533205 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" gracePeriod=600 Nov 26 12:36:21 crc kubenswrapper[4834]: E1126 12:36:21.648414 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:36:22 crc kubenswrapper[4834]: 
I1126 12:36:22.534349 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" exitCode=0 Nov 26 12:36:22 crc kubenswrapper[4834]: I1126 12:36:22.534430 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"} Nov 26 12:36:22 crc kubenswrapper[4834]: I1126 12:36:22.534666 4834 scope.go:117] "RemoveContainer" containerID="acd09ade57ac546b50450d570a864d3e80d6aed5b0f2fac09427c5ddbb2d4ad1" Nov 26 12:36:22 crc kubenswrapper[4834]: I1126 12:36:22.535213 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:36:22 crc kubenswrapper[4834]: E1126 12:36:22.535607 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:36:32 crc kubenswrapper[4834]: I1126 12:36:32.023928 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lq9rd"] Nov 26 12:36:32 crc kubenswrapper[4834]: I1126 12:36:32.030571 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lq9rd"] Nov 26 12:36:32 crc kubenswrapper[4834]: I1126 12:36:32.424655 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53beea6d-dc20-4423-b1eb-ff58ff4c2c69" path="/var/lib/kubelet/pods/53beea6d-dc20-4423-b1eb-ff58ff4c2c69/volumes" Nov 26 12:36:35 crc 
kubenswrapper[4834]: I1126 12:36:35.021997 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6c9ml"] Nov 26 12:36:35 crc kubenswrapper[4834]: I1126 12:36:35.029801 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6c9ml"] Nov 26 12:36:35 crc kubenswrapper[4834]: I1126 12:36:35.037974 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-rjgqd"] Nov 26 12:36:35 crc kubenswrapper[4834]: I1126 12:36:35.043686 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-rjgqd"] Nov 26 12:36:36 crc kubenswrapper[4834]: I1126 12:36:36.426236 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eac617a-c5a8-44c6-b790-55ec23e59e5a" path="/var/lib/kubelet/pods/2eac617a-c5a8-44c6-b790-55ec23e59e5a/volumes" Nov 26 12:36:36 crc kubenswrapper[4834]: I1126 12:36:36.427755 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d96f008c-e967-4142-a433-92edb4634097" path="/var/lib/kubelet/pods/d96f008c-e967-4142-a433-92edb4634097/volumes" Nov 26 12:36:38 crc kubenswrapper[4834]: I1126 12:36:38.417574 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:36:38 crc kubenswrapper[4834]: E1126 12:36:38.417920 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:36:44 crc kubenswrapper[4834]: I1126 12:36:44.773046 4834 scope.go:117] "RemoveContainer" containerID="edf55806b92d1dad809694bfadccf8d9051ede3bfb157924b86fd74714adce77" Nov 26 12:36:44 crc 
kubenswrapper[4834]: I1126 12:36:44.794323 4834 scope.go:117] "RemoveContainer" containerID="32f6e03b9de3c2ed3c5798007a388c3d7cc327119438ee13cc0406f46ddd0c4b" Nov 26 12:36:44 crc kubenswrapper[4834]: I1126 12:36:44.823475 4834 scope.go:117] "RemoveContainer" containerID="a6dbc3a0a3a93678bfb749c6e279736cead11a4a8a7bc07dcef5a394ed4b721a" Nov 26 12:36:44 crc kubenswrapper[4834]: I1126 12:36:44.866626 4834 scope.go:117] "RemoveContainer" containerID="4c73203051b391700f57910c10d27f99f5acadaf3336763b2e0f10478367f55a" Nov 26 12:36:44 crc kubenswrapper[4834]: I1126 12:36:44.882225 4834 scope.go:117] "RemoveContainer" containerID="307f633476b01b2a97375072d9c897b59d86ac5e923e230b01a11b8413ad965d" Nov 26 12:36:44 crc kubenswrapper[4834]: I1126 12:36:44.918243 4834 scope.go:117] "RemoveContainer" containerID="2bf4127e75887f916ec9f7fb22b34b8a25180ec24233c82c3eb8867ad84444f1" Nov 26 12:36:44 crc kubenswrapper[4834]: I1126 12:36:44.941659 4834 scope.go:117] "RemoveContainer" containerID="02bc616cb82e30421c00aa45067e6c0f25fc87c2fbf7ff8078601aae94379266" Nov 26 12:36:44 crc kubenswrapper[4834]: I1126 12:36:44.962902 4834 scope.go:117] "RemoveContainer" containerID="36dc29a3b278be0dc496b3e00caf0131be7692570972dc4ef80807eb90cf6b8b" Nov 26 12:36:44 crc kubenswrapper[4834]: I1126 12:36:44.986174 4834 scope.go:117] "RemoveContainer" containerID="0d0e0711aeb57d676157a6ba95bf757efc273fdf0bed1f12bcd6556a7998ef13" Nov 26 12:36:45 crc kubenswrapper[4834]: I1126 12:36:45.000668 4834 scope.go:117] "RemoveContainer" containerID="b0909226f01908f9f2ded595fced81eb1c80fe80c1f5e098da1d11534c07302e" Nov 26 12:36:45 crc kubenswrapper[4834]: I1126 12:36:45.029336 4834 scope.go:117] "RemoveContainer" containerID="751c2e04de4e897361d5bc61b03c3ed78e5c4b5558a3751f8654ae5c125ba76f" Nov 26 12:36:45 crc kubenswrapper[4834]: I1126 12:36:45.048773 4834 scope.go:117] "RemoveContainer" containerID="a89add7f786792e2eb6b0f82ac3f6a2baf92706d68b30113fefc9c08e6a68a32" Nov 26 12:36:49 crc kubenswrapper[4834]: I1126 
12:36:49.042410 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-kj94c"] Nov 26 12:36:49 crc kubenswrapper[4834]: I1126 12:36:49.050492 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-kj94c"] Nov 26 12:36:49 crc kubenswrapper[4834]: I1126 12:36:49.416679 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:36:49 crc kubenswrapper[4834]: E1126 12:36:49.416891 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:36:50 crc kubenswrapper[4834]: I1126 12:36:50.426613 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ddc5896-100d-473a-9bed-a2e13560bc8e" path="/var/lib/kubelet/pods/0ddc5896-100d-473a-9bed-a2e13560bc8e/volumes" Nov 26 12:37:02 crc kubenswrapper[4834]: I1126 12:37:02.421295 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:37:02 crc kubenswrapper[4834]: E1126 12:37:02.421963 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:37:15 crc kubenswrapper[4834]: I1126 12:37:15.417018 4834 scope.go:117] "RemoveContainer" 
containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:37:15 crc kubenswrapper[4834]: E1126 12:37:15.417573 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.030524 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f898-account-create-update-2b85n"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.041996 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kdm8m"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.048399 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6jvd5"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.054571 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-fecf-account-create-update-rbc66"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.059788 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-96ae-account-create-update-8pxqn"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.064332 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-696dw"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.069094 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kdm8m"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.073922 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6jvd5"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.078501 4834 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f898-account-create-update-2b85n"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.083063 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-96ae-account-create-update-8pxqn"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.087590 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-696dw"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.092084 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-fecf-account-create-update-rbc66"] Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.425250 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ea3d77-ce69-4f54-877e-56e1a9c9183d" path="/var/lib/kubelet/pods/23ea3d77-ce69-4f54-877e-56e1a9c9183d/volumes" Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.425894 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ab7f94-6d9b-4ac8-a50a-9332f240ddf4" path="/var/lib/kubelet/pods/80ab7f94-6d9b-4ac8-a50a-9332f240ddf4/volumes" Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.426407 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93841b51-b2ec-473c-ae5d-f7f652ba6aa7" path="/var/lib/kubelet/pods/93841b51-b2ec-473c-ae5d-f7f652ba6aa7/volumes" Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.426881 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5f3da7-3847-4e39-b7d6-908f8de8740c" path="/var/lib/kubelet/pods/bc5f3da7-3847-4e39-b7d6-908f8de8740c/volumes" Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.427809 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c623e2b3-2d04-4579-bbf2-fe1aecc98118" path="/var/lib/kubelet/pods/c623e2b3-2d04-4579-bbf2-fe1aecc98118/volumes" Nov 26 12:37:16 crc kubenswrapper[4834]: I1126 12:37:16.428284 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d582581d-dd1b-48dc-8356-f911726bf78e" path="/var/lib/kubelet/pods/d582581d-dd1b-48dc-8356-f911726bf78e/volumes" Nov 26 12:37:26 crc kubenswrapper[4834]: I1126 12:37:26.417196 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:37:26 crc kubenswrapper[4834]: E1126 12:37:26.418061 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.680756 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-chwkf"] Nov 26 12:37:27 crc kubenswrapper[4834]: E1126 12:37:27.681411 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fadb4daf-e9ae-4a04-a176-5bff2d64dea6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.681436 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fadb4daf-e9ae-4a04-a176-5bff2d64dea6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.681631 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fadb4daf-e9ae-4a04-a176-5bff2d64dea6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.682811 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.695112 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-chwkf"] Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.824633 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-catalog-content\") pod \"community-operators-chwkf\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.825408 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4f7q\" (UniqueName: \"kubernetes.io/projected/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-kube-api-access-k4f7q\") pod \"community-operators-chwkf\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.825686 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-utilities\") pod \"community-operators-chwkf\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.928161 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-utilities\") pod \"community-operators-chwkf\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.928257 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-catalog-content\") pod \"community-operators-chwkf\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.928575 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4f7q\" (UniqueName: \"kubernetes.io/projected/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-kube-api-access-k4f7q\") pod \"community-operators-chwkf\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.928851 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-utilities\") pod \"community-operators-chwkf\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.928942 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-catalog-content\") pod \"community-operators-chwkf\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.953102 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4f7q\" (UniqueName: \"kubernetes.io/projected/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-kube-api-access-k4f7q\") pod \"community-operators-chwkf\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:27 crc kubenswrapper[4834]: I1126 12:37:27.997147 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:28 crc kubenswrapper[4834]: I1126 12:37:28.435094 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-chwkf"] Nov 26 12:37:29 crc kubenswrapper[4834]: I1126 12:37:29.009102 4834 generic.go:334] "Generic (PLEG): container finished" podID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerID="4735d2a140fdc3960e3304f65e47024f90d7a037bd3a641615e33efd6237d9f8" exitCode=0 Nov 26 12:37:29 crc kubenswrapper[4834]: I1126 12:37:29.009197 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-chwkf" event={"ID":"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4","Type":"ContainerDied","Data":"4735d2a140fdc3960e3304f65e47024f90d7a037bd3a641615e33efd6237d9f8"} Nov 26 12:37:29 crc kubenswrapper[4834]: I1126 12:37:29.009406 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-chwkf" event={"ID":"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4","Type":"ContainerStarted","Data":"59cacb67bc6189729999ae067e4b4d718833b5f19b262ed49f36c9efe1698fc6"} Nov 26 12:37:30 crc kubenswrapper[4834]: I1126 12:37:30.019501 4834 generic.go:334] "Generic (PLEG): container finished" podID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerID="c92b5ba0d3242d64bbaca53677af90710d39c5030c3555ee1603110b7e949b88" exitCode=0 Nov 26 12:37:30 crc kubenswrapper[4834]: I1126 12:37:30.019607 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-chwkf" event={"ID":"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4","Type":"ContainerDied","Data":"c92b5ba0d3242d64bbaca53677af90710d39c5030c3555ee1603110b7e949b88"} Nov 26 12:37:31 crc kubenswrapper[4834]: I1126 12:37:31.037245 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-chwkf" 
event={"ID":"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4","Type":"ContainerStarted","Data":"7485dac982b19d5e402c8f72db10e3cb6ae97fd87441f8ee9c0ac5d819ff5324"} Nov 26 12:37:31 crc kubenswrapper[4834]: I1126 12:37:31.055834 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-chwkf" podStartSLOduration=2.42819006 podStartE2EDuration="4.055820921s" podCreationTimestamp="2025-11-26 12:37:27 +0000 UTC" firstStartedPulling="2025-11-26 12:37:29.010635654 +0000 UTC m=+1546.917849006" lastFinishedPulling="2025-11-26 12:37:30.638266515 +0000 UTC m=+1548.545479867" observedRunningTime="2025-11-26 12:37:31.051013543 +0000 UTC m=+1548.958226895" watchObservedRunningTime="2025-11-26 12:37:31.055820921 +0000 UTC m=+1548.963034273" Nov 26 12:37:36 crc kubenswrapper[4834]: I1126 12:37:36.021074 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x7vq5"] Nov 26 12:37:36 crc kubenswrapper[4834]: I1126 12:37:36.026820 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x7vq5"] Nov 26 12:37:36 crc kubenswrapper[4834]: I1126 12:37:36.425384 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2eb7a31-4805-49e5-81cc-58208e57f440" path="/var/lib/kubelet/pods/a2eb7a31-4805-49e5-81cc-58208e57f440/volumes" Nov 26 12:37:37 crc kubenswrapper[4834]: I1126 12:37:37.901358 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xxczx"] Nov 26 12:37:37 crc kubenswrapper[4834]: I1126 12:37:37.903687 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:37 crc kubenswrapper[4834]: I1126 12:37:37.909148 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xxczx"] Nov 26 12:37:37 crc kubenswrapper[4834]: I1126 12:37:37.919151 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-utilities\") pod \"redhat-operators-xxczx\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:37 crc kubenswrapper[4834]: I1126 12:37:37.919471 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-catalog-content\") pod \"redhat-operators-xxczx\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:37 crc kubenswrapper[4834]: I1126 12:37:37.919543 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g859f\" (UniqueName: \"kubernetes.io/projected/090b1071-ced5-49e9-9fbe-353ce3688fac-kube-api-access-g859f\") pod \"redhat-operators-xxczx\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:37 crc kubenswrapper[4834]: I1126 12:37:37.997785 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:37 crc kubenswrapper[4834]: I1126 12:37:37.997830 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:38 crc kubenswrapper[4834]: I1126 12:37:38.021280 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-utilities\") pod \"redhat-operators-xxczx\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:38 crc kubenswrapper[4834]: I1126 12:37:38.021329 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-catalog-content\") pod \"redhat-operators-xxczx\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:38 crc kubenswrapper[4834]: I1126 12:37:38.021349 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g859f\" (UniqueName: \"kubernetes.io/projected/090b1071-ced5-49e9-9fbe-353ce3688fac-kube-api-access-g859f\") pod \"redhat-operators-xxczx\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:38 crc kubenswrapper[4834]: I1126 12:37:38.021717 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-utilities\") pod \"redhat-operators-xxczx\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:38 crc kubenswrapper[4834]: I1126 12:37:38.021786 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-catalog-content\") pod \"redhat-operators-xxczx\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:38 crc kubenswrapper[4834]: I1126 12:37:38.031274 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:38 crc 
kubenswrapper[4834]: I1126 12:37:38.041996 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g859f\" (UniqueName: \"kubernetes.io/projected/090b1071-ced5-49e9-9fbe-353ce3688fac-kube-api-access-g859f\") pod \"redhat-operators-xxczx\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:38 crc kubenswrapper[4834]: I1126 12:37:38.110412 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:38 crc kubenswrapper[4834]: I1126 12:37:38.218478 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:38 crc kubenswrapper[4834]: I1126 12:37:38.621943 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xxczx"] Nov 26 12:37:39 crc kubenswrapper[4834]: I1126 12:37:39.088576 4834 generic.go:334] "Generic (PLEG): container finished" podID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerID="ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f" exitCode=0 Nov 26 12:37:39 crc kubenswrapper[4834]: I1126 12:37:39.088675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxczx" event={"ID":"090b1071-ced5-49e9-9fbe-353ce3688fac","Type":"ContainerDied","Data":"ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f"} Nov 26 12:37:39 crc kubenswrapper[4834]: I1126 12:37:39.088845 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxczx" event={"ID":"090b1071-ced5-49e9-9fbe-353ce3688fac","Type":"ContainerStarted","Data":"47b56e4a435f596bdd5cf0a08fc2ac6d3c8a5d2f7256c015ffebe2c0832ac46b"} Nov 26 12:37:40 crc kubenswrapper[4834]: I1126 12:37:40.095863 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxczx" 
event={"ID":"090b1071-ced5-49e9-9fbe-353ce3688fac","Type":"ContainerStarted","Data":"1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177"} Nov 26 12:37:40 crc kubenswrapper[4834]: I1126 12:37:40.285955 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-chwkf"] Nov 26 12:37:40 crc kubenswrapper[4834]: I1126 12:37:40.286140 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-chwkf" podUID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerName="registry-server" containerID="cri-o://7485dac982b19d5e402c8f72db10e3cb6ae97fd87441f8ee9c0ac5d819ff5324" gracePeriod=2 Nov 26 12:37:41 crc kubenswrapper[4834]: I1126 12:37:41.416920 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:37:41 crc kubenswrapper[4834]: E1126 12:37:41.417324 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.110089 4834 generic.go:334] "Generic (PLEG): container finished" podID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerID="1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177" exitCode=0 Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.110167 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxczx" event={"ID":"090b1071-ced5-49e9-9fbe-353ce3688fac","Type":"ContainerDied","Data":"1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177"} Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 
12:37:42.111845 4834 generic.go:334] "Generic (PLEG): container finished" podID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerID="7485dac982b19d5e402c8f72db10e3cb6ae97fd87441f8ee9c0ac5d819ff5324" exitCode=0 Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.111908 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-chwkf" event={"ID":"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4","Type":"ContainerDied","Data":"7485dac982b19d5e402c8f72db10e3cb6ae97fd87441f8ee9c0ac5d819ff5324"} Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.475936 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.593180 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-utilities\") pod \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.593268 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4f7q\" (UniqueName: \"kubernetes.io/projected/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-kube-api-access-k4f7q\") pod \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.593521 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-catalog-content\") pod \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\" (UID: \"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4\") " Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.594031 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-utilities" (OuterVolumeSpecName: "utilities") pod "ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" (UID: "ca58eb9e-9ff2-4be9-ba78-39c6017a54d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.594637 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.598466 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-kube-api-access-k4f7q" (OuterVolumeSpecName: "kube-api-access-k4f7q") pod "ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" (UID: "ca58eb9e-9ff2-4be9-ba78-39c6017a54d4"). InnerVolumeSpecName "kube-api-access-k4f7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.640484 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" (UID: "ca58eb9e-9ff2-4be9-ba78-39c6017a54d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.697259 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4f7q\" (UniqueName: \"kubernetes.io/projected/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-kube-api-access-k4f7q\") on node \"crc\" DevicePath \"\"" Nov 26 12:37:42 crc kubenswrapper[4834]: I1126 12:37:42.697534 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:37:43 crc kubenswrapper[4834]: I1126 12:37:43.120915 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxczx" event={"ID":"090b1071-ced5-49e9-9fbe-353ce3688fac","Type":"ContainerStarted","Data":"efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3"} Nov 26 12:37:43 crc kubenswrapper[4834]: I1126 12:37:43.123183 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-chwkf" event={"ID":"ca58eb9e-9ff2-4be9-ba78-39c6017a54d4","Type":"ContainerDied","Data":"59cacb67bc6189729999ae067e4b4d718833b5f19b262ed49f36c9efe1698fc6"} Nov 26 12:37:43 crc kubenswrapper[4834]: I1126 12:37:43.123288 4834 scope.go:117] "RemoveContainer" containerID="7485dac982b19d5e402c8f72db10e3cb6ae97fd87441f8ee9c0ac5d819ff5324" Nov 26 12:37:43 crc kubenswrapper[4834]: I1126 12:37:43.123333 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-chwkf" Nov 26 12:37:43 crc kubenswrapper[4834]: I1126 12:37:43.138782 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xxczx" podStartSLOduration=2.593669085 podStartE2EDuration="6.138767003s" podCreationTimestamp="2025-11-26 12:37:37 +0000 UTC" firstStartedPulling="2025-11-26 12:37:39.090363204 +0000 UTC m=+1556.997576556" lastFinishedPulling="2025-11-26 12:37:42.635461112 +0000 UTC m=+1560.542674474" observedRunningTime="2025-11-26 12:37:43.135903201 +0000 UTC m=+1561.043116553" watchObservedRunningTime="2025-11-26 12:37:43.138767003 +0000 UTC m=+1561.045980355" Nov 26 12:37:43 crc kubenswrapper[4834]: I1126 12:37:43.153246 4834 scope.go:117] "RemoveContainer" containerID="c92b5ba0d3242d64bbaca53677af90710d39c5030c3555ee1603110b7e949b88" Nov 26 12:37:43 crc kubenswrapper[4834]: I1126 12:37:43.155732 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-chwkf"] Nov 26 12:37:43 crc kubenswrapper[4834]: I1126 12:37:43.162546 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-chwkf"] Nov 26 12:37:43 crc kubenswrapper[4834]: I1126 12:37:43.173279 4834 scope.go:117] "RemoveContainer" containerID="4735d2a140fdc3960e3304f65e47024f90d7a037bd3a641615e33efd6237d9f8" Nov 26 12:37:44 crc kubenswrapper[4834]: I1126 12:37:44.425238 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" path="/var/lib/kubelet/pods/ca58eb9e-9ff2-4be9-ba78-39c6017a54d4/volumes" Nov 26 12:37:45 crc kubenswrapper[4834]: I1126 12:37:45.187232 4834 scope.go:117] "RemoveContainer" containerID="40e3be5605fd15f174e3be8cd4912cdadd09e826064b88ce16cd1b29ade9a15b" Nov 26 12:37:45 crc kubenswrapper[4834]: I1126 12:37:45.208224 4834 scope.go:117] "RemoveContainer" 
containerID="d85b75b77650a7f0b61cde22b43dda4aa8963631b76a43a92777ca6f7f227752" Nov 26 12:37:45 crc kubenswrapper[4834]: I1126 12:37:45.238120 4834 scope.go:117] "RemoveContainer" containerID="778784c4f5c14b5e5c3b18c0961907593fdfc7d78ac0e2cae4afda7faa6c1b4b" Nov 26 12:37:45 crc kubenswrapper[4834]: I1126 12:37:45.270974 4834 scope.go:117] "RemoveContainer" containerID="e961004ab1d8e0705e4cd7dc27c910f3006255fa1cdaddd53953741cc52444d3" Nov 26 12:37:45 crc kubenswrapper[4834]: I1126 12:37:45.308906 4834 scope.go:117] "RemoveContainer" containerID="748d665e660676e17ae4ee029f0a9abd2545dc41c8b5fae14feade09b0a33fed" Nov 26 12:37:45 crc kubenswrapper[4834]: I1126 12:37:45.327299 4834 scope.go:117] "RemoveContainer" containerID="b6f58900e8721452ffca28c24f35a59f242fd9d8179dbe33d6c438d6f0c41d60" Nov 26 12:37:45 crc kubenswrapper[4834]: I1126 12:37:45.367686 4834 scope.go:117] "RemoveContainer" containerID="9ed195b735780020224948fd1aa2e26a0c4a55439af451293798f9648ffa3046" Nov 26 12:37:45 crc kubenswrapper[4834]: I1126 12:37:45.384354 4834 scope.go:117] "RemoveContainer" containerID="586c00aebc9cd36d3de7a558343e02479269bdedd2bef06bdcee8c8f62d001e8" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.692192 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2j7kq"] Nov 26 12:37:47 crc kubenswrapper[4834]: E1126 12:37:47.693538 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerName="extract-utilities" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.693604 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerName="extract-utilities" Nov 26 12:37:47 crc kubenswrapper[4834]: E1126 12:37:47.693670 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerName="registry-server" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.693721 4834 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerName="registry-server" Nov 26 12:37:47 crc kubenswrapper[4834]: E1126 12:37:47.693766 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerName="extract-content" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.693813 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerName="extract-content" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.694025 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca58eb9e-9ff2-4be9-ba78-39c6017a54d4" containerName="registry-server" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.695231 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.702364 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j7kq"] Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.779164 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-utilities\") pod \"redhat-marketplace-2j7kq\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.779198 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-catalog-content\") pod \"redhat-marketplace-2j7kq\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.779278 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th5lz\" (UniqueName: \"kubernetes.io/projected/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-kube-api-access-th5lz\") pod \"redhat-marketplace-2j7kq\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.881004 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-utilities\") pod \"redhat-marketplace-2j7kq\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.881219 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-catalog-content\") pod \"redhat-marketplace-2j7kq\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.881289 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th5lz\" (UniqueName: \"kubernetes.io/projected/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-kube-api-access-th5lz\") pod \"redhat-marketplace-2j7kq\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.881496 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-utilities\") pod \"redhat-marketplace-2j7kq\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.881734 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-catalog-content\") pod \"redhat-marketplace-2j7kq\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:47 crc kubenswrapper[4834]: I1126 12:37:47.896547 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th5lz\" (UniqueName: \"kubernetes.io/projected/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-kube-api-access-th5lz\") pod \"redhat-marketplace-2j7kq\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:48 crc kubenswrapper[4834]: I1126 12:37:48.009120 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:48 crc kubenswrapper[4834]: I1126 12:37:48.218849 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:48 crc kubenswrapper[4834]: I1126 12:37:48.218896 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:48 crc kubenswrapper[4834]: I1126 12:37:48.252094 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:48 crc kubenswrapper[4834]: I1126 12:37:48.382172 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j7kq"] Nov 26 12:37:49 crc kubenswrapper[4834]: I1126 12:37:49.165010 4834 generic.go:334] "Generic (PLEG): container finished" podID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerID="566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53" exitCode=0 Nov 26 12:37:49 crc kubenswrapper[4834]: I1126 12:37:49.165046 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-2j7kq" event={"ID":"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d","Type":"ContainerDied","Data":"566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53"} Nov 26 12:37:49 crc kubenswrapper[4834]: I1126 12:37:49.165287 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j7kq" event={"ID":"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d","Type":"ContainerStarted","Data":"2987fa31516bc21309f5c61bf8b80aeac305f7667fa254abe621284ce4379d80"} Nov 26 12:37:49 crc kubenswrapper[4834]: I1126 12:37:49.199226 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:50 crc kubenswrapper[4834]: I1126 12:37:50.174115 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j7kq" event={"ID":"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d","Type":"ContainerStarted","Data":"45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405"} Nov 26 12:37:50 crc kubenswrapper[4834]: I1126 12:37:50.486875 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxczx"] Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.181971 4834 generic.go:334] "Generic (PLEG): container finished" podID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerID="45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405" exitCode=0 Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.182040 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j7kq" event={"ID":"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d","Type":"ContainerDied","Data":"45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405"} Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.182766 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xxczx" 
podUID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerName="registry-server" containerID="cri-o://efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3" gracePeriod=2 Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.538603 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.542335 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-utilities\") pod \"090b1071-ced5-49e9-9fbe-353ce3688fac\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.543022 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-utilities" (OuterVolumeSpecName: "utilities") pod "090b1071-ced5-49e9-9fbe-353ce3688fac" (UID: "090b1071-ced5-49e9-9fbe-353ce3688fac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.643631 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-catalog-content\") pod \"090b1071-ced5-49e9-9fbe-353ce3688fac\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.643751 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g859f\" (UniqueName: \"kubernetes.io/projected/090b1071-ced5-49e9-9fbe-353ce3688fac-kube-api-access-g859f\") pod \"090b1071-ced5-49e9-9fbe-353ce3688fac\" (UID: \"090b1071-ced5-49e9-9fbe-353ce3688fac\") " Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.644034 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.648633 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090b1071-ced5-49e9-9fbe-353ce3688fac-kube-api-access-g859f" (OuterVolumeSpecName: "kube-api-access-g859f") pod "090b1071-ced5-49e9-9fbe-353ce3688fac" (UID: "090b1071-ced5-49e9-9fbe-353ce3688fac"). InnerVolumeSpecName "kube-api-access-g859f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.711506 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "090b1071-ced5-49e9-9fbe-353ce3688fac" (UID: "090b1071-ced5-49e9-9fbe-353ce3688fac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.745015 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090b1071-ced5-49e9-9fbe-353ce3688fac-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:37:51 crc kubenswrapper[4834]: I1126 12:37:51.745040 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g859f\" (UniqueName: \"kubernetes.io/projected/090b1071-ced5-49e9-9fbe-353ce3688fac-kube-api-access-g859f\") on node \"crc\" DevicePath \"\"" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.194608 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j7kq" event={"ID":"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d","Type":"ContainerStarted","Data":"1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2"} Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.197064 4834 generic.go:334] "Generic (PLEG): container finished" podID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerID="efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3" exitCode=0 Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.197092 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxczx" event={"ID":"090b1071-ced5-49e9-9fbe-353ce3688fac","Type":"ContainerDied","Data":"efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3"} Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.197109 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxczx" event={"ID":"090b1071-ced5-49e9-9fbe-353ce3688fac","Type":"ContainerDied","Data":"47b56e4a435f596bdd5cf0a08fc2ac6d3c8a5d2f7256c015ffebe2c0832ac46b"} Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.197124 4834 scope.go:117] "RemoveContainer" 
containerID="efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.197139 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxczx" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.210184 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2j7kq" podStartSLOduration=2.562378723 podStartE2EDuration="5.210170401s" podCreationTimestamp="2025-11-26 12:37:47 +0000 UTC" firstStartedPulling="2025-11-26 12:37:49.166462881 +0000 UTC m=+1567.073676232" lastFinishedPulling="2025-11-26 12:37:51.814254558 +0000 UTC m=+1569.721467910" observedRunningTime="2025-11-26 12:37:52.208942626 +0000 UTC m=+1570.116155978" watchObservedRunningTime="2025-11-26 12:37:52.210170401 +0000 UTC m=+1570.117383753" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.214957 4834 scope.go:117] "RemoveContainer" containerID="1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.225118 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxczx"] Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.230960 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xxczx"] Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.251239 4834 scope.go:117] "RemoveContainer" containerID="ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.268997 4834 scope.go:117] "RemoveContainer" containerID="efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3" Nov 26 12:37:52 crc kubenswrapper[4834]: E1126 12:37:52.269262 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3\": container with ID starting with efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3 not found: ID does not exist" containerID="efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.269375 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3"} err="failed to get container status \"efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3\": rpc error: code = NotFound desc = could not find container \"efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3\": container with ID starting with efedbcf16aaf90b5b960933538570d2aeb3c428e757399ca95527b8a1f2481d3 not found: ID does not exist" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.269457 4834 scope.go:117] "RemoveContainer" containerID="1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177" Nov 26 12:37:52 crc kubenswrapper[4834]: E1126 12:37:52.269714 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177\": container with ID starting with 1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177 not found: ID does not exist" containerID="1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.269735 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177"} err="failed to get container status \"1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177\": rpc error: code = NotFound desc = could not find container \"1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177\": container with ID 
starting with 1e83e4e2f143b8a1c76bc645db0d29571dd82d3e36934c40d6b9210c09923177 not found: ID does not exist" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.269750 4834 scope.go:117] "RemoveContainer" containerID="ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f" Nov 26 12:37:52 crc kubenswrapper[4834]: E1126 12:37:52.270069 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f\": container with ID starting with ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f not found: ID does not exist" containerID="ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.270137 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f"} err="failed to get container status \"ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f\": rpc error: code = NotFound desc = could not find container \"ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f\": container with ID starting with ea55dce3a4824763aef65bf3410757ab63bea45c3eca5e6e3bf51a09da89b43f not found: ID does not exist" Nov 26 12:37:52 crc kubenswrapper[4834]: I1126 12:37:52.425758 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="090b1071-ced5-49e9-9fbe-353ce3688fac" path="/var/lib/kubelet/pods/090b1071-ced5-49e9-9fbe-353ce3688fac/volumes" Nov 26 12:37:55 crc kubenswrapper[4834]: I1126 12:37:55.417338 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:37:55 crc kubenswrapper[4834]: E1126 12:37:55.417751 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:37:57 crc kubenswrapper[4834]: I1126 12:37:57.029568 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gnszd"] Nov 26 12:37:57 crc kubenswrapper[4834]: I1126 12:37:57.035891 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-b558t"] Nov 26 12:37:57 crc kubenswrapper[4834]: I1126 12:37:57.042417 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gnszd"] Nov 26 12:37:57 crc kubenswrapper[4834]: I1126 12:37:57.049280 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-b558t"] Nov 26 12:37:58 crc kubenswrapper[4834]: I1126 12:37:58.010067 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:58 crc kubenswrapper[4834]: I1126 12:37:58.010155 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:58 crc kubenswrapper[4834]: I1126 12:37:58.042273 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:58 crc kubenswrapper[4834]: I1126 12:37:58.263919 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:37:58 crc kubenswrapper[4834]: I1126 12:37:58.425408 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="163ea76c-b946-4d71-ab57-fc60b515cced" path="/var/lib/kubelet/pods/163ea76c-b946-4d71-ab57-fc60b515cced/volumes" Nov 26 12:37:58 crc kubenswrapper[4834]: I1126 
12:37:58.425910 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54e62ed-7746-4227-957c-febe65052a53" path="/var/lib/kubelet/pods/f54e62ed-7746-4227-957c-febe65052a53/volumes" Nov 26 12:37:58 crc kubenswrapper[4834]: I1126 12:37:58.879503 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j7kq"] Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.254794 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2j7kq" podUID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerName="registry-server" containerID="cri-o://1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2" gracePeriod=2 Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.631168 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.684417 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th5lz\" (UniqueName: \"kubernetes.io/projected/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-kube-api-access-th5lz\") pod \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.684586 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-catalog-content\") pod \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\" (UID: \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.684665 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-utilities\") pod \"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\" (UID: 
\"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d\") " Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.686138 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-utilities" (OuterVolumeSpecName: "utilities") pod "f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" (UID: "f6f27190-ed0f-4b40-b5ee-99d8843f5e3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.688956 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-kube-api-access-th5lz" (OuterVolumeSpecName: "kube-api-access-th5lz") pod "f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" (UID: "f6f27190-ed0f-4b40-b5ee-99d8843f5e3d"). InnerVolumeSpecName "kube-api-access-th5lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.698239 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" (UID: "f6f27190-ed0f-4b40-b5ee-99d8843f5e3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.787503 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.787538 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:38:00 crc kubenswrapper[4834]: I1126 12:38:00.787550 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th5lz\" (UniqueName: \"kubernetes.io/projected/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d-kube-api-access-th5lz\") on node \"crc\" DevicePath \"\"" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.270366 4834 generic.go:334] "Generic (PLEG): container finished" podID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerID="1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2" exitCode=0 Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.270434 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2j7kq" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.270438 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j7kq" event={"ID":"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d","Type":"ContainerDied","Data":"1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2"} Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.270538 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j7kq" event={"ID":"f6f27190-ed0f-4b40-b5ee-99d8843f5e3d","Type":"ContainerDied","Data":"2987fa31516bc21309f5c61bf8b80aeac305f7667fa254abe621284ce4379d80"} Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.270557 4834 scope.go:117] "RemoveContainer" containerID="1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.287734 4834 scope.go:117] "RemoveContainer" containerID="45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.302868 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j7kq"] Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.308863 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j7kq"] Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.319287 4834 scope.go:117] "RemoveContainer" containerID="566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.340681 4834 scope.go:117] "RemoveContainer" containerID="1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2" Nov 26 12:38:01 crc kubenswrapper[4834]: E1126 12:38:01.341099 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2\": container with ID starting with 1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2 not found: ID does not exist" containerID="1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.341145 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2"} err="failed to get container status \"1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2\": rpc error: code = NotFound desc = could not find container \"1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2\": container with ID starting with 1695158bbc2b0a023da194949e2b75e15a0d274a72bb8f35b5aa48f490c6d1a2 not found: ID does not exist" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.341168 4834 scope.go:117] "RemoveContainer" containerID="45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405" Nov 26 12:38:01 crc kubenswrapper[4834]: E1126 12:38:01.341726 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405\": container with ID starting with 45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405 not found: ID does not exist" containerID="45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.341756 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405"} err="failed to get container status \"45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405\": rpc error: code = NotFound desc = could not find container \"45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405\": container with ID 
starting with 45c9d612d586b4dc3cf3f959711095be0cc33371054a8750aa2c2ec7efcb4405 not found: ID does not exist" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.341783 4834 scope.go:117] "RemoveContainer" containerID="566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53" Nov 26 12:38:01 crc kubenswrapper[4834]: E1126 12:38:01.342047 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53\": container with ID starting with 566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53 not found: ID does not exist" containerID="566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53" Nov 26 12:38:01 crc kubenswrapper[4834]: I1126 12:38:01.342096 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53"} err="failed to get container status \"566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53\": rpc error: code = NotFound desc = could not find container \"566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53\": container with ID starting with 566cb5b2fcaa38318e83ec7d2f161bb1f8a3ebdb4d38ed6b9b0cd15f6ded0c53 not found: ID does not exist" Nov 26 12:38:02 crc kubenswrapper[4834]: I1126 12:38:02.441683 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" path="/var/lib/kubelet/pods/f6f27190-ed0f-4b40-b5ee-99d8843f5e3d/volumes" Nov 26 12:38:06 crc kubenswrapper[4834]: I1126 12:38:06.418333 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:38:06 crc kubenswrapper[4834]: E1126 12:38:06.419121 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.612484 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d5wbg"] Nov 26 12:38:15 crc kubenswrapper[4834]: E1126 12:38:15.613540 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerName="extract-content" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.613556 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerName="extract-content" Nov 26 12:38:15 crc kubenswrapper[4834]: E1126 12:38:15.613575 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerName="registry-server" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.613586 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerName="registry-server" Nov 26 12:38:15 crc kubenswrapper[4834]: E1126 12:38:15.613600 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerName="extract-utilities" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.613607 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerName="extract-utilities" Nov 26 12:38:15 crc kubenswrapper[4834]: E1126 12:38:15.613617 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerName="registry-server" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.613622 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerName="registry-server" Nov 26 12:38:15 crc kubenswrapper[4834]: E1126 12:38:15.613642 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerName="extract-utilities" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.613647 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerName="extract-utilities" Nov 26 12:38:15 crc kubenswrapper[4834]: E1126 12:38:15.613661 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerName="extract-content" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.613667 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerName="extract-content" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.613838 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="090b1071-ced5-49e9-9fbe-353ce3688fac" containerName="registry-server" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.613855 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f27190-ed0f-4b40-b5ee-99d8843f5e3d" containerName="registry-server" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.615295 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.620364 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5wbg"] Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.669188 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-utilities\") pod \"certified-operators-d5wbg\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.669387 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-catalog-content\") pod \"certified-operators-d5wbg\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.669439 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2pnv\" (UniqueName: \"kubernetes.io/projected/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-kube-api-access-c2pnv\") pod \"certified-operators-d5wbg\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.771528 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2pnv\" (UniqueName: \"kubernetes.io/projected/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-kube-api-access-c2pnv\") pod \"certified-operators-d5wbg\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.771652 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-utilities\") pod \"certified-operators-d5wbg\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.771736 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-catalog-content\") pod \"certified-operators-d5wbg\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.772116 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-catalog-content\") pod \"certified-operators-d5wbg\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.772202 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-utilities\") pod \"certified-operators-d5wbg\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.788480 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2pnv\" (UniqueName: \"kubernetes.io/projected/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-kube-api-access-c2pnv\") pod \"certified-operators-d5wbg\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:15 crc kubenswrapper[4834]: I1126 12:38:15.934732 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:16 crc kubenswrapper[4834]: I1126 12:38:16.186635 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5wbg"] Nov 26 12:38:16 crc kubenswrapper[4834]: I1126 12:38:16.394952 4834 generic.go:334] "Generic (PLEG): container finished" podID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerID="0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202" exitCode=0 Nov 26 12:38:16 crc kubenswrapper[4834]: I1126 12:38:16.395072 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5wbg" event={"ID":"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f","Type":"ContainerDied","Data":"0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202"} Nov 26 12:38:16 crc kubenswrapper[4834]: I1126 12:38:16.395707 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5wbg" event={"ID":"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f","Type":"ContainerStarted","Data":"5e8298209ddeb0159e712ce7805abfe665d643e5ecab2f3cf617fad1fcf39720"} Nov 26 12:38:17 crc kubenswrapper[4834]: I1126 12:38:17.417615 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:38:17 crc kubenswrapper[4834]: E1126 12:38:17.417963 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:38:18 crc kubenswrapper[4834]: I1126 12:38:18.415081 4834 generic.go:334] "Generic (PLEG): container finished" podID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" 
containerID="1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47" exitCode=0 Nov 26 12:38:18 crc kubenswrapper[4834]: I1126 12:38:18.415136 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5wbg" event={"ID":"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f","Type":"ContainerDied","Data":"1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47"} Nov 26 12:38:20 crc kubenswrapper[4834]: I1126 12:38:20.435282 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5wbg" event={"ID":"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f","Type":"ContainerStarted","Data":"6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4"} Nov 26 12:38:20 crc kubenswrapper[4834]: I1126 12:38:20.459506 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d5wbg" podStartSLOduration=2.590338787 podStartE2EDuration="5.459486121s" podCreationTimestamp="2025-11-26 12:38:15 +0000 UTC" firstStartedPulling="2025-11-26 12:38:16.397767228 +0000 UTC m=+1594.304980580" lastFinishedPulling="2025-11-26 12:38:19.266914562 +0000 UTC m=+1597.174127914" observedRunningTime="2025-11-26 12:38:20.456521049 +0000 UTC m=+1598.363734401" watchObservedRunningTime="2025-11-26 12:38:20.459486121 +0000 UTC m=+1598.366699474" Nov 26 12:38:25 crc kubenswrapper[4834]: I1126 12:38:25.935372 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:25 crc kubenswrapper[4834]: I1126 12:38:25.935960 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:25 crc kubenswrapper[4834]: I1126 12:38:25.976439 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:26 crc kubenswrapper[4834]: I1126 
12:38:26.517844 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:26 crc kubenswrapper[4834]: I1126 12:38:26.563414 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5wbg"] Nov 26 12:38:28 crc kubenswrapper[4834]: I1126 12:38:28.495617 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d5wbg" podUID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerName="registry-server" containerID="cri-o://6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4" gracePeriod=2 Nov 26 12:38:28 crc kubenswrapper[4834]: I1126 12:38:28.887739 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5wbg" Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.033371 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-catalog-content\") pod \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.033797 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-utilities\") pod \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.034170 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2pnv\" (UniqueName: \"kubernetes.io/projected/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-kube-api-access-c2pnv\") pod \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\" (UID: \"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f\") " Nov 26 12:38:29 crc kubenswrapper[4834]: 
I1126 12:38:29.034527 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-utilities" (OuterVolumeSpecName: "utilities") pod "3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" (UID: "3bc02d4e-1ec3-4149-b00d-00d5db80bf9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.034963 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.039454 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-kube-api-access-c2pnv" (OuterVolumeSpecName: "kube-api-access-c2pnv") pod "3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" (UID: "3bc02d4e-1ec3-4149-b00d-00d5db80bf9f"). InnerVolumeSpecName "kube-api-access-c2pnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.067816 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" (UID: "3bc02d4e-1ec3-4149-b00d-00d5db80bf9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.135680 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.135706 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2pnv\" (UniqueName: \"kubernetes.io/projected/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f-kube-api-access-c2pnv\") on node \"crc\" DevicePath \"\"" Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.503290 4834 generic.go:334] "Generic (PLEG): container finished" podID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerID="6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4" exitCode=0 Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.503341 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5wbg" event={"ID":"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f","Type":"ContainerDied","Data":"6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4"} Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.503363 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5wbg" event={"ID":"3bc02d4e-1ec3-4149-b00d-00d5db80bf9f","Type":"ContainerDied","Data":"5e8298209ddeb0159e712ce7805abfe665d643e5ecab2f3cf617fad1fcf39720"} Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.503379 4834 scope.go:117] "RemoveContainer" containerID="6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4" Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.503494 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5wbg"
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.524171 4834 scope.go:117] "RemoveContainer" containerID="1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47"
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.529050 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5wbg"]
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.534567 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d5wbg"]
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.561286 4834 scope.go:117] "RemoveContainer" containerID="0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202"
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.582460 4834 scope.go:117] "RemoveContainer" containerID="6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4"
Nov 26 12:38:29 crc kubenswrapper[4834]: E1126 12:38:29.582796 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4\": container with ID starting with 6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4 not found: ID does not exist" containerID="6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4"
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.582828 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4"} err="failed to get container status \"6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4\": rpc error: code = NotFound desc = could not find container \"6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4\": container with ID starting with 6277ccacfea06eaa8bedca90b0f7a21ad501564bc74238f81ede3447533355e4 not found: ID does not exist"
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.582851 4834 scope.go:117] "RemoveContainer" containerID="1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47"
Nov 26 12:38:29 crc kubenswrapper[4834]: E1126 12:38:29.583159 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47\": container with ID starting with 1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47 not found: ID does not exist" containerID="1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47"
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.583206 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47"} err="failed to get container status \"1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47\": rpc error: code = NotFound desc = could not find container \"1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47\": container with ID starting with 1ea90e3647becd34c0df06cd28e631db7cf79a83669609ae47504a499e4a5a47 not found: ID does not exist"
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.583232 4834 scope.go:117] "RemoveContainer" containerID="0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202"
Nov 26 12:38:29 crc kubenswrapper[4834]: E1126 12:38:29.583636 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202\": container with ID starting with 0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202 not found: ID does not exist" containerID="0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202"
Nov 26 12:38:29 crc kubenswrapper[4834]: I1126 12:38:29.583660 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202"} err="failed to get container status \"0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202\": rpc error: code = NotFound desc = could not find container \"0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202\": container with ID starting with 0b85179284f602d98fee97ee2b23f0338544be5c47e7356582cb267f2a4b8202 not found: ID does not exist"
Nov 26 12:38:30 crc kubenswrapper[4834]: I1126 12:38:30.428402 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" path="/var/lib/kubelet/pods/3bc02d4e-1ec3-4149-b00d-00d5db80bf9f/volumes"
Nov 26 12:38:31 crc kubenswrapper[4834]: I1126 12:38:31.416996 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"
Nov 26 12:38:31 crc kubenswrapper[4834]: E1126 12:38:31.417491 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:38:39 crc kubenswrapper[4834]: I1126 12:38:39.040434 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4r524"]
Nov 26 12:38:39 crc kubenswrapper[4834]: I1126 12:38:39.047747 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4r524"]
Nov 26 12:38:40 crc kubenswrapper[4834]: I1126 12:38:40.428167 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18" path="/var/lib/kubelet/pods/7e3de3d4-3aa8-4e96-9e1f-3bf028e10e18/volumes"
Nov 26 12:38:45 crc kubenswrapper[4834]: I1126 12:38:45.418756 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"
Nov 26 12:38:45 crc kubenswrapper[4834]: E1126 12:38:45.419858 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:38:45 crc kubenswrapper[4834]: I1126 12:38:45.503713 4834 scope.go:117] "RemoveContainer" containerID="b71b9afc11dd45f6015d69187b8a583d7a421bbd5b2d5db75e66cd6ac4b57ee5"
Nov 26 12:38:45 crc kubenswrapper[4834]: I1126 12:38:45.540894 4834 scope.go:117] "RemoveContainer" containerID="237ac414a7e30fbabaa31c28f26bdb44891e631402bca7a56cd5e08e0521f6fe"
Nov 26 12:38:45 crc kubenswrapper[4834]: I1126 12:38:45.588356 4834 scope.go:117] "RemoveContainer" containerID="4842915b89f15ec8a244c43ba00fb3d189cd6a6d355104f223c4f056b47ea8f3"
Nov 26 12:38:58 crc kubenswrapper[4834]: I1126 12:38:58.417520 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"
Nov 26 12:38:58 crc kubenswrapper[4834]: E1126 12:38:58.418569 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:39:09 crc kubenswrapper[4834]: I1126 12:39:09.416840 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"
Nov 26 12:39:09 crc kubenswrapper[4834]: E1126 12:39:09.417657 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:39:22 crc kubenswrapper[4834]: I1126 12:39:22.422554 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"
Nov 26 12:39:22 crc kubenswrapper[4834]: E1126 12:39:22.423456 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:39:34 crc kubenswrapper[4834]: I1126 12:39:34.417684 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"
Nov 26 12:39:34 crc kubenswrapper[4834]: E1126 12:39:34.418395 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:39:45 crc kubenswrapper[4834]: I1126 12:39:45.417591 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"
Nov 26 12:39:45 crc kubenswrapper[4834]: E1126 12:39:45.418584 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.489403 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.498954 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-txqsg"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.507323 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.513320 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.518805 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pbfgn"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.523602 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.528253 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.532681 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.537045 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldm9w"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.541285 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.547586 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gjnrf"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.550344 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.554630 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ldfn2"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.561768 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cz65w"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.567108 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bglxj"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.571492 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pbfgn"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.575821 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.580036 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bq88h"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.584508 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-64nd8"]
Nov 26 12:39:47 crc kubenswrapper[4834]: I1126 12:39:47.588687 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lnmfz"]
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.429382 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3070833c-b52d-4d58-baf2-76bbf8a315b3" path="/var/lib/kubelet/pods/3070833c-b52d-4d58-baf2-76bbf8a315b3/volumes"
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.430064 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="442421c1-5218-4074-8ee3-6673c9e16308" path="/var/lib/kubelet/pods/442421c1-5218-4074-8ee3-6673c9e16308/volumes"
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.430566 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6f6d49-1d40-44e3-8e84-25ea8a06743e" path="/var/lib/kubelet/pods/4a6f6d49-1d40-44e3-8e84-25ea8a06743e/volumes"
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.431032 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c9f98e-072f-4f6b-928e-3ea0d2f44cc4" path="/var/lib/kubelet/pods/71c9f98e-072f-4f6b-928e-3ea0d2f44cc4/volumes"
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.431959 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798933d4-3b38-48e2-8e14-0272d7daf788" path="/var/lib/kubelet/pods/798933d4-3b38-48e2-8e14-0272d7daf788/volumes"
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.432431 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c" path="/var/lib/kubelet/pods/8c66d4c3-bb9a-45a5-8ab2-3902c3289f9c/volumes"
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.432897 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb41a5cd-cab1-4a34-a435-d5a34b7a21a1" path="/var/lib/kubelet/pods/cb41a5cd-cab1-4a34-a435-d5a34b7a21a1/volumes"
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.433799 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc7c233f-2fa5-4441-a8e2-229bc771e093" path="/var/lib/kubelet/pods/cc7c233f-2fa5-4441-a8e2-229bc771e093/volumes"
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.434251 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d204d6c9-614d-4ee7-9e82-e4d3b2402a43" path="/var/lib/kubelet/pods/d204d6c9-614d-4ee7-9e82-e4d3b2402a43/volumes"
Nov 26 12:39:48 crc kubenswrapper[4834]: I1126 12:39:48.434737 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fadb4daf-e9ae-4a04-a176-5bff2d64dea6" path="/var/lib/kubelet/pods/fadb4daf-e9ae-4a04-a176-5bff2d64dea6/volumes"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.362131 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"]
Nov 26 12:39:52 crc kubenswrapper[4834]: E1126 12:39:52.362690 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerName="extract-utilities"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.362705 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerName="extract-utilities"
Nov 26 12:39:52 crc kubenswrapper[4834]: E1126 12:39:52.362719 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerName="extract-content"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.362724 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerName="extract-content"
Nov 26 12:39:52 crc kubenswrapper[4834]: E1126 12:39:52.362748 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerName="registry-server"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.362753 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerName="registry-server"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.362925 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc02d4e-1ec3-4149-b00d-00d5db80bf9f" containerName="registry-server"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.364869 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.366802 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.367122 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.367259 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.367301 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.367429 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.376631 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"]
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.431121 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.431272 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpvd7\" (UniqueName: \"kubernetes.io/projected/5492b71a-1903-424a-a46b-afe0e6713024-kube-api-access-bpvd7\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.431425 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.431551 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.431631 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.533920 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.534018 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.534288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.534389 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpvd7\" (UniqueName: \"kubernetes.io/projected/5492b71a-1903-424a-a46b-afe0e6713024-kube-api-access-bpvd7\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.534490 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.540720 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.540928 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.541076 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.541888 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.551232 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpvd7\" (UniqueName: \"kubernetes.io/projected/5492b71a-1903-424a-a46b-afe0e6713024-kube-api-access-bpvd7\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:52 crc kubenswrapper[4834]: I1126 12:39:52.682629 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:39:53 crc kubenswrapper[4834]: I1126 12:39:53.169738 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"]
Nov 26 12:39:53 crc kubenswrapper[4834]: W1126 12:39:53.173011 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5492b71a_1903_424a_a46b_afe0e6713024.slice/crio-ae956acf19b1bbd44a1fdc1ccee1473d36c09dd05ba32a4382abe2a03c8f8bd2 WatchSource:0}: Error finding container ae956acf19b1bbd44a1fdc1ccee1473d36c09dd05ba32a4382abe2a03c8f8bd2: Status 404 returned error can't find the container with id ae956acf19b1bbd44a1fdc1ccee1473d36c09dd05ba32a4382abe2a03c8f8bd2
Nov 26 12:39:54 crc kubenswrapper[4834]: I1126 12:39:54.190909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn" event={"ID":"5492b71a-1903-424a-a46b-afe0e6713024","Type":"ContainerStarted","Data":"2d9b94aa7df211c95dc119ef6cf2d2c3527b3acda1d2535de431f9d777dcd0c5"}
Nov 26 12:39:54 crc kubenswrapper[4834]: I1126 12:39:54.191263 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn" event={"ID":"5492b71a-1903-424a-a46b-afe0e6713024","Type":"ContainerStarted","Data":"ae956acf19b1bbd44a1fdc1ccee1473d36c09dd05ba32a4382abe2a03c8f8bd2"}
Nov 26 12:39:54 crc kubenswrapper[4834]: I1126 12:39:54.211263 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn" podStartSLOduration=1.592731855 podStartE2EDuration="2.211242944s" podCreationTimestamp="2025-11-26 12:39:52 +0000 UTC" firstStartedPulling="2025-11-26 12:39:53.174943779 +0000 UTC m=+1691.082157131" lastFinishedPulling="2025-11-26 12:39:53.793454868 +0000 UTC m=+1691.700668220" observedRunningTime="2025-11-26 12:39:54.209056009 +0000 UTC m=+1692.116269362" watchObservedRunningTime="2025-11-26 12:39:54.211242944 +0000 UTC m=+1692.118456296"
Nov 26 12:39:57 crc kubenswrapper[4834]: I1126 12:39:57.420154 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8"
Nov 26 12:39:57 crc kubenswrapper[4834]: E1126 12:39:57.421856 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:40:03 crc kubenswrapper[4834]: I1126 12:40:03.273713 4834 generic.go:334] "Generic (PLEG): container finished" podID="5492b71a-1903-424a-a46b-afe0e6713024" containerID="2d9b94aa7df211c95dc119ef6cf2d2c3527b3acda1d2535de431f9d777dcd0c5" exitCode=0
Nov 26 12:40:03 crc kubenswrapper[4834]: I1126 12:40:03.273806 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn" event={"ID":"5492b71a-1903-424a-a46b-afe0e6713024","Type":"ContainerDied","Data":"2d9b94aa7df211c95dc119ef6cf2d2c3527b3acda1d2535de431f9d777dcd0c5"}
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.616941 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.703826 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ssh-key\") pod \"5492b71a-1903-424a-a46b-afe0e6713024\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") "
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.703992 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpvd7\" (UniqueName: \"kubernetes.io/projected/5492b71a-1903-424a-a46b-afe0e6713024-kube-api-access-bpvd7\") pod \"5492b71a-1903-424a-a46b-afe0e6713024\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") "
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.704077 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-inventory\") pod \"5492b71a-1903-424a-a46b-afe0e6713024\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") "
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.704151 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ceph\") pod \"5492b71a-1903-424a-a46b-afe0e6713024\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") "
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.704271 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-repo-setup-combined-ca-bundle\") pod \"5492b71a-1903-424a-a46b-afe0e6713024\" (UID: \"5492b71a-1903-424a-a46b-afe0e6713024\") "
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.709846 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5492b71a-1903-424a-a46b-afe0e6713024-kube-api-access-bpvd7" (OuterVolumeSpecName: "kube-api-access-bpvd7") pod "5492b71a-1903-424a-a46b-afe0e6713024" (UID: "5492b71a-1903-424a-a46b-afe0e6713024"). InnerVolumeSpecName "kube-api-access-bpvd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.710277 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ceph" (OuterVolumeSpecName: "ceph") pod "5492b71a-1903-424a-a46b-afe0e6713024" (UID: "5492b71a-1903-424a-a46b-afe0e6713024"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.713456 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5492b71a-1903-424a-a46b-afe0e6713024" (UID: "5492b71a-1903-424a-a46b-afe0e6713024"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.733179 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5492b71a-1903-424a-a46b-afe0e6713024" (UID: "5492b71a-1903-424a-a46b-afe0e6713024"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.733951 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-inventory" (OuterVolumeSpecName: "inventory") pod "5492b71a-1903-424a-a46b-afe0e6713024" (UID: "5492b71a-1903-424a-a46b-afe0e6713024"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.806468 4834 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.806503 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.806519 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpvd7\" (UniqueName: \"kubernetes.io/projected/5492b71a-1903-424a-a46b-afe0e6713024-kube-api-access-bpvd7\") on node \"crc\" DevicePath \"\""
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.806530 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-inventory\") on node \"crc\" DevicePath \"\""
Nov 26 12:40:04 crc kubenswrapper[4834]: I1126 12:40:04.806548 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5492b71a-1903-424a-a46b-afe0e6713024-ceph\") on node \"crc\" DevicePath \"\""
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.293775 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn" event={"ID":"5492b71a-1903-424a-a46b-afe0e6713024","Type":"ContainerDied","Data":"ae956acf19b1bbd44a1fdc1ccee1473d36c09dd05ba32a4382abe2a03c8f8bd2"}
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.293993 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae956acf19b1bbd44a1fdc1ccee1473d36c09dd05ba32a4382abe2a03c8f8bd2"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.293843 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.360040 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"]
Nov 26 12:40:05 crc kubenswrapper[4834]: E1126 12:40:05.360870 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5492b71a-1903-424a-a46b-afe0e6713024" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.360916 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5492b71a-1903-424a-a46b-afe0e6713024" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.361304 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5492b71a-1903-424a-a46b-afe0e6713024" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.362784 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.364654 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.364654 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.365111 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.365209 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.365746 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.371245 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"]
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.419115 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xphq8\" (UniqueName: \"kubernetes.io/projected/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-kube-api-access-xphq8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.419166 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.419200 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.419480 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.419621 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.520833 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"
Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.521398 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.521588 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.521861 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xphq8\" (UniqueName: \"kubernetes.io/projected/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-kube-api-access-xphq8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.522014 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.527492 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.527602 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.527757 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.528667 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.540345 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xphq8\" (UniqueName: \"kubernetes.io/projected/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-kube-api-access-xphq8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z6466\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:05 crc kubenswrapper[4834]: I1126 12:40:05.678129 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:40:06 crc kubenswrapper[4834]: I1126 12:40:06.162590 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466"] Nov 26 12:40:06 crc kubenswrapper[4834]: I1126 12:40:06.305771 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" event={"ID":"d7c68a58-7124-4e32-b9d5-a2d0154c63dd","Type":"ContainerStarted","Data":"0465a63c45924835873960239b8d702968c9ccd6f6615211a26f6da461b747df"} Nov 26 12:40:07 crc kubenswrapper[4834]: I1126 12:40:07.315302 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" event={"ID":"d7c68a58-7124-4e32-b9d5-a2d0154c63dd","Type":"ContainerStarted","Data":"f9c75b059d9156d5d60689513af518d70525b57850e8f0b00f2080a5bbb5e936"} Nov 26 12:40:07 crc kubenswrapper[4834]: I1126 12:40:07.338189 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" podStartSLOduration=1.687340818 podStartE2EDuration="2.338170473s" podCreationTimestamp="2025-11-26 12:40:05 +0000 UTC" firstStartedPulling="2025-11-26 12:40:06.169161374 +0000 UTC m=+1704.076374716" lastFinishedPulling="2025-11-26 12:40:06.819991019 +0000 UTC m=+1704.727204371" observedRunningTime="2025-11-26 12:40:07.332835691 +0000 UTC m=+1705.240049043" watchObservedRunningTime="2025-11-26 12:40:07.338170473 +0000 UTC m=+1705.245383825" Nov 26 12:40:08 crc kubenswrapper[4834]: I1126 12:40:08.417421 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:40:08 crc kubenswrapper[4834]: E1126 12:40:08.417960 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:40:20 crc kubenswrapper[4834]: I1126 12:40:20.417664 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:40:20 crc kubenswrapper[4834]: E1126 12:40:20.418757 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:40:33 crc kubenswrapper[4834]: I1126 12:40:33.416752 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:40:33 crc kubenswrapper[4834]: E1126 12:40:33.417712 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:40:45 crc kubenswrapper[4834]: I1126 12:40:45.712739 4834 scope.go:117] "RemoveContainer" containerID="616698bbc8391482d68210a45c8ea38b76f177bfef0a42a6a9b8bc2e4144faa5" Nov 26 12:40:45 crc kubenswrapper[4834]: I1126 12:40:45.734881 4834 scope.go:117] "RemoveContainer" containerID="c73626f919d0b959d8d0a2a80e0e8f48efe68993cbeb280f32f357b2a4a390a6" Nov 26 
12:40:45 crc kubenswrapper[4834]: I1126 12:40:45.801116 4834 scope.go:117] "RemoveContainer" containerID="52ca1883f082639a9a510e3cfa4438a410b3f7514f960125f7adea0d09769623" Nov 26 12:40:45 crc kubenswrapper[4834]: I1126 12:40:45.848426 4834 scope.go:117] "RemoveContainer" containerID="1f0922cc64244aa912e289bebb22a03b4667f43dcd58cc0a33950ad8aecfe95f" Nov 26 12:40:45 crc kubenswrapper[4834]: I1126 12:40:45.882428 4834 scope.go:117] "RemoveContainer" containerID="a54064b1949a728c4a3914fe325f342d908d377366c8e9bc5a601ff54c145873" Nov 26 12:40:47 crc kubenswrapper[4834]: I1126 12:40:47.417051 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:40:47 crc kubenswrapper[4834]: E1126 12:40:47.417980 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:40:58 crc kubenswrapper[4834]: I1126 12:40:58.417368 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:40:58 crc kubenswrapper[4834]: E1126 12:40:58.417929 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:41:13 crc kubenswrapper[4834]: I1126 12:41:13.417447 4834 scope.go:117] "RemoveContainer" 
containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:41:13 crc kubenswrapper[4834]: E1126 12:41:13.418057 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:41:25 crc kubenswrapper[4834]: I1126 12:41:25.417363 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:41:25 crc kubenswrapper[4834]: I1126 12:41:25.883068 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"91771d6ddc659e4116cd0d5491ab5dd50d9b99c3c9aec05e2aca8bfb28c92527"} Nov 26 12:41:30 crc kubenswrapper[4834]: I1126 12:41:30.915773 4834 generic.go:334] "Generic (PLEG): container finished" podID="d7c68a58-7124-4e32-b9d5-a2d0154c63dd" containerID="f9c75b059d9156d5d60689513af518d70525b57850e8f0b00f2080a5bbb5e936" exitCode=0 Nov 26 12:41:30 crc kubenswrapper[4834]: I1126 12:41:30.915885 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" event={"ID":"d7c68a58-7124-4e32-b9d5-a2d0154c63dd","Type":"ContainerDied","Data":"f9c75b059d9156d5d60689513af518d70525b57850e8f0b00f2080a5bbb5e936"} Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.205710 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.376425 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ssh-key\") pod \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.377281 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-inventory\") pod \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.379157 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ceph\") pod \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.379218 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-bootstrap-combined-ca-bundle\") pod \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.379330 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xphq8\" (UniqueName: \"kubernetes.io/projected/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-kube-api-access-xphq8\") pod \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\" (UID: \"d7c68a58-7124-4e32-b9d5-a2d0154c63dd\") " Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.383975 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d7c68a58-7124-4e32-b9d5-a2d0154c63dd" (UID: "d7c68a58-7124-4e32-b9d5-a2d0154c63dd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.392119 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ceph" (OuterVolumeSpecName: "ceph") pod "d7c68a58-7124-4e32-b9d5-a2d0154c63dd" (UID: "d7c68a58-7124-4e32-b9d5-a2d0154c63dd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.395658 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-kube-api-access-xphq8" (OuterVolumeSpecName: "kube-api-access-xphq8") pod "d7c68a58-7124-4e32-b9d5-a2d0154c63dd" (UID: "d7c68a58-7124-4e32-b9d5-a2d0154c63dd"). InnerVolumeSpecName "kube-api-access-xphq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.401257 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d7c68a58-7124-4e32-b9d5-a2d0154c63dd" (UID: "d7c68a58-7124-4e32-b9d5-a2d0154c63dd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.409561 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-inventory" (OuterVolumeSpecName: "inventory") pod "d7c68a58-7124-4e32-b9d5-a2d0154c63dd" (UID: "d7c68a58-7124-4e32-b9d5-a2d0154c63dd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.483846 4834 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.483875 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xphq8\" (UniqueName: \"kubernetes.io/projected/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-kube-api-access-xphq8\") on node \"crc\" DevicePath \"\"" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.483885 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.483894 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.483904 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c68a58-7124-4e32-b9d5-a2d0154c63dd-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.930242 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" event={"ID":"d7c68a58-7124-4e32-b9d5-a2d0154c63dd","Type":"ContainerDied","Data":"0465a63c45924835873960239b8d702968c9ccd6f6615211a26f6da461b747df"} Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.930279 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0465a63c45924835873960239b8d702968c9ccd6f6615211a26f6da461b747df" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.930325 4834 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z6466" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.983917 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r"] Nov 26 12:41:32 crc kubenswrapper[4834]: E1126 12:41:32.984248 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c68a58-7124-4e32-b9d5-a2d0154c63dd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.984267 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c68a58-7124-4e32-b9d5-a2d0154c63dd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.984474 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c68a58-7124-4e32-b9d5-a2d0154c63dd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.985015 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.986750 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.986858 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.986925 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.986958 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.986969 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.988321 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.988351 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.988432 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.988465 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pv9b\" (UniqueName: \"kubernetes.io/projected/0c3d3815-4dbf-4add-a702-d215553b2bbb-kube-api-access-5pv9b\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:32 crc kubenswrapper[4834]: I1126 12:41:32.999075 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r"] Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.090036 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.090503 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.090620 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.090657 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pv9b\" (UniqueName: \"kubernetes.io/projected/0c3d3815-4dbf-4add-a702-d215553b2bbb-kube-api-access-5pv9b\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.093724 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.093747 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.094008 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.104454 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pv9b\" (UniqueName: \"kubernetes.io/projected/0c3d3815-4dbf-4add-a702-d215553b2bbb-kube-api-access-5pv9b\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.300276 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.743843 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r"] Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.749651 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 12:41:33 crc kubenswrapper[4834]: I1126 12:41:33.939859 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" event={"ID":"0c3d3815-4dbf-4add-a702-d215553b2bbb","Type":"ContainerStarted","Data":"fb79de3208d1da89f006966ec717334cb739654e922dfa53532bbdcb4c66c0e2"} Nov 26 12:41:34 crc kubenswrapper[4834]: I1126 12:41:34.948169 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" event={"ID":"0c3d3815-4dbf-4add-a702-d215553b2bbb","Type":"ContainerStarted","Data":"89b08b05869ab6694a23924346f3f92122946b71f9cd79a4a72db3c343362e6f"} Nov 26 12:41:34 crc kubenswrapper[4834]: I1126 12:41:34.965021 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" podStartSLOduration=2.449343029 podStartE2EDuration="2.96500713s" podCreationTimestamp="2025-11-26 12:41:32 +0000 UTC" firstStartedPulling="2025-11-26 12:41:33.749439109 +0000 UTC m=+1791.656652461" lastFinishedPulling="2025-11-26 12:41:34.26510321 +0000 UTC m=+1792.172316562" observedRunningTime="2025-11-26 12:41:34.961754334 +0000 UTC m=+1792.868967687" watchObservedRunningTime="2025-11-26 12:41:34.96500713 +0000 UTC m=+1792.872220482" Nov 26 12:41:45 crc kubenswrapper[4834]: I1126 12:41:45.972006 4834 scope.go:117] "RemoveContainer" containerID="d4e556eb1e52bdad84e09095b481e032082bc42bc2beb56913e1b84939fbf3fa" Nov 26 12:41:46 crc kubenswrapper[4834]: I1126 12:41:46.016265 4834 scope.go:117] "RemoveContainer" containerID="7bbbd9de0f4ed12dbf67a0fe6bcaddcf6aaf829b68393b03879f5c2ef6d7083a" Nov 26 12:41:46 crc kubenswrapper[4834]: I1126 12:41:46.050546 4834 scope.go:117] "RemoveContainer" containerID="337833c4d8253d2415960a7c25de7e080586b7e3e49eedafc9da223167448d74" Nov 26 12:41:52 crc kubenswrapper[4834]: I1126 12:41:52.074984 4834 generic.go:334] "Generic (PLEG): container finished" podID="0c3d3815-4dbf-4add-a702-d215553b2bbb" containerID="89b08b05869ab6694a23924346f3f92122946b71f9cd79a4a72db3c343362e6f" exitCode=0 Nov 26 12:41:52 crc kubenswrapper[4834]: I1126 12:41:52.075072 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" event={"ID":"0c3d3815-4dbf-4add-a702-d215553b2bbb","Type":"ContainerDied","Data":"89b08b05869ab6694a23924346f3f92122946b71f9cd79a4a72db3c343362e6f"} Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.403814 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.527473 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-inventory\") pod \"0c3d3815-4dbf-4add-a702-d215553b2bbb\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.527576 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pv9b\" (UniqueName: \"kubernetes.io/projected/0c3d3815-4dbf-4add-a702-d215553b2bbb-kube-api-access-5pv9b\") pod \"0c3d3815-4dbf-4add-a702-d215553b2bbb\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.528145 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ssh-key\") pod \"0c3d3815-4dbf-4add-a702-d215553b2bbb\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.528647 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ceph\") pod \"0c3d3815-4dbf-4add-a702-d215553b2bbb\" (UID: \"0c3d3815-4dbf-4add-a702-d215553b2bbb\") " Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.534098 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ceph" (OuterVolumeSpecName: "ceph") pod "0c3d3815-4dbf-4add-a702-d215553b2bbb" (UID: "0c3d3815-4dbf-4add-a702-d215553b2bbb"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.534816 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3d3815-4dbf-4add-a702-d215553b2bbb-kube-api-access-5pv9b" (OuterVolumeSpecName: "kube-api-access-5pv9b") pod "0c3d3815-4dbf-4add-a702-d215553b2bbb" (UID: "0c3d3815-4dbf-4add-a702-d215553b2bbb"). InnerVolumeSpecName "kube-api-access-5pv9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.551571 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0c3d3815-4dbf-4add-a702-d215553b2bbb" (UID: "0c3d3815-4dbf-4add-a702-d215553b2bbb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.553077 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-inventory" (OuterVolumeSpecName: "inventory") pod "0c3d3815-4dbf-4add-a702-d215553b2bbb" (UID: "0c3d3815-4dbf-4add-a702-d215553b2bbb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.634657 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.635191 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.635204 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3d3815-4dbf-4add-a702-d215553b2bbb-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:41:53 crc kubenswrapper[4834]: I1126 12:41:53.635215 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pv9b\" (UniqueName: \"kubernetes.io/projected/0c3d3815-4dbf-4add-a702-d215553b2bbb-kube-api-access-5pv9b\") on node \"crc\" DevicePath \"\"" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.091536 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" event={"ID":"0c3d3815-4dbf-4add-a702-d215553b2bbb","Type":"ContainerDied","Data":"fb79de3208d1da89f006966ec717334cb739654e922dfa53532bbdcb4c66c0e2"} Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.091581 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb79de3208d1da89f006966ec717334cb739654e922dfa53532bbdcb4c66c0e2" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.091828 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.151910 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db"] Nov 26 12:41:54 crc kubenswrapper[4834]: E1126 12:41:54.152271 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3d3815-4dbf-4add-a702-d215553b2bbb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.152290 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3d3815-4dbf-4add-a702-d215553b2bbb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.152467 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3d3815-4dbf-4add-a702-d215553b2bbb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.153040 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.155437 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.157840 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.158842 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.164346 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.164586 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.192883 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db"] Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.348469 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.348658 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.348825 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbpcg\" (UniqueName: \"kubernetes.io/projected/0b19933f-0934-423b-b0f8-a25f6180446e-kube-api-access-hbpcg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.348876 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.451150 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbpcg\" (UniqueName: \"kubernetes.io/projected/0b19933f-0934-423b-b0f8-a25f6180446e-kube-api-access-hbpcg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.451241 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.451588 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.452002 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.456004 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.456690 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.458130 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.466405 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbpcg\" (UniqueName: \"kubernetes.io/projected/0b19933f-0934-423b-b0f8-a25f6180446e-kube-api-access-hbpcg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6x2db\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.477703 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:41:54 crc kubenswrapper[4834]: I1126 12:41:54.903440 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db"] Nov 26 12:41:54 crc kubenswrapper[4834]: W1126 12:41:54.906526 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b19933f_0934_423b_b0f8_a25f6180446e.slice/crio-46db0813b8bf156fa6f35cfc4e32e0d033ba5247e44c871f6be7418def7fe849 WatchSource:0}: Error finding container 46db0813b8bf156fa6f35cfc4e32e0d033ba5247e44c871f6be7418def7fe849: Status 404 returned error can't find the container with id 46db0813b8bf156fa6f35cfc4e32e0d033ba5247e44c871f6be7418def7fe849 Nov 26 12:41:55 crc kubenswrapper[4834]: I1126 12:41:55.102697 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" event={"ID":"0b19933f-0934-423b-b0f8-a25f6180446e","Type":"ContainerStarted","Data":"46db0813b8bf156fa6f35cfc4e32e0d033ba5247e44c871f6be7418def7fe849"} Nov 26 12:41:56 crc kubenswrapper[4834]: I1126 12:41:56.111423 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" event={"ID":"0b19933f-0934-423b-b0f8-a25f6180446e","Type":"ContainerStarted","Data":"08ccda5a8f7f05b9146e2c01e984a15e73cfd2cc394fa046c6fedf9cec7b0703"} Nov 26 12:41:56 crc kubenswrapper[4834]: I1126 12:41:56.124764 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" podStartSLOduration=1.569068098 podStartE2EDuration="2.124747378s" podCreationTimestamp="2025-11-26 12:41:54 +0000 UTC" firstStartedPulling="2025-11-26 12:41:54.908957078 +0000 UTC m=+1812.816170430" lastFinishedPulling="2025-11-26 12:41:55.464636358 +0000 UTC m=+1813.371849710" observedRunningTime="2025-11-26 12:41:56.123741601 +0000 UTC m=+1814.030954953" watchObservedRunningTime="2025-11-26 12:41:56.124747378 +0000 UTC m=+1814.031960730" Nov 26 12:42:00 crc kubenswrapper[4834]: I1126 12:42:00.139182 4834 generic.go:334] "Generic (PLEG): container finished" podID="0b19933f-0934-423b-b0f8-a25f6180446e" containerID="08ccda5a8f7f05b9146e2c01e984a15e73cfd2cc394fa046c6fedf9cec7b0703" exitCode=0 Nov 26 12:42:00 crc kubenswrapper[4834]: I1126 12:42:00.139342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" event={"ID":"0b19933f-0934-423b-b0f8-a25f6180446e","Type":"ContainerDied","Data":"08ccda5a8f7f05b9146e2c01e984a15e73cfd2cc394fa046c6fedf9cec7b0703"} Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.460114 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.474774 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ssh-key\") pod \"0b19933f-0934-423b-b0f8-a25f6180446e\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.475270 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-inventory\") pod \"0b19933f-0934-423b-b0f8-a25f6180446e\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.475574 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ceph\") pod \"0b19933f-0934-423b-b0f8-a25f6180446e\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.476369 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbpcg\" (UniqueName: \"kubernetes.io/projected/0b19933f-0934-423b-b0f8-a25f6180446e-kube-api-access-hbpcg\") pod \"0b19933f-0934-423b-b0f8-a25f6180446e\" (UID: \"0b19933f-0934-423b-b0f8-a25f6180446e\") " Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.480696 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ceph" (OuterVolumeSpecName: "ceph") pod "0b19933f-0934-423b-b0f8-a25f6180446e" (UID: "0b19933f-0934-423b-b0f8-a25f6180446e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.481355 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b19933f-0934-423b-b0f8-a25f6180446e-kube-api-access-hbpcg" (OuterVolumeSpecName: "kube-api-access-hbpcg") pod "0b19933f-0934-423b-b0f8-a25f6180446e" (UID: "0b19933f-0934-423b-b0f8-a25f6180446e"). InnerVolumeSpecName "kube-api-access-hbpcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.497029 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-inventory" (OuterVolumeSpecName: "inventory") pod "0b19933f-0934-423b-b0f8-a25f6180446e" (UID: "0b19933f-0934-423b-b0f8-a25f6180446e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.497949 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b19933f-0934-423b-b0f8-a25f6180446e" (UID: "0b19933f-0934-423b-b0f8-a25f6180446e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.580293 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.580347 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.580359 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b19933f-0934-423b-b0f8-a25f6180446e-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:01 crc kubenswrapper[4834]: I1126 12:42:01.580375 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbpcg\" (UniqueName: \"kubernetes.io/projected/0b19933f-0934-423b-b0f8-a25f6180446e-kube-api-access-hbpcg\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.154719 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" event={"ID":"0b19933f-0934-423b-b0f8-a25f6180446e","Type":"ContainerDied","Data":"46db0813b8bf156fa6f35cfc4e32e0d033ba5247e44c871f6be7418def7fe849"} Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.154780 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46db0813b8bf156fa6f35cfc4e32e0d033ba5247e44c871f6be7418def7fe849" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.154802 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6x2db" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.209648 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl"] Nov 26 12:42:02 crc kubenswrapper[4834]: E1126 12:42:02.210067 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b19933f-0934-423b-b0f8-a25f6180446e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.210088 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b19933f-0934-423b-b0f8-a25f6180446e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.210275 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b19933f-0934-423b-b0f8-a25f6180446e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.210953 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.213561 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.213638 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.214412 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.214420 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.215648 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.226094 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl"] Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.295108 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.295183 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47nzd\" (UniqueName: \"kubernetes.io/projected/10689f77-1926-4fc1-be46-ab8c79eb3d11-kube-api-access-47nzd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") 
" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.295213 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.295241 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.398203 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.398292 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47nzd\" (UniqueName: \"kubernetes.io/projected/10689f77-1926-4fc1-be46-ab8c79eb3d11-kube-api-access-47nzd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.398340 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.398376 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.402978 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.403031 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.404649 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.414398 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-47nzd\" (UniqueName: \"kubernetes.io/projected/10689f77-1926-4fc1-be46-ab8c79eb3d11-kube-api-access-47nzd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-575pl\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.524859 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:02 crc kubenswrapper[4834]: I1126 12:42:02.949221 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl"] Nov 26 12:42:03 crc kubenswrapper[4834]: I1126 12:42:03.168193 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" event={"ID":"10689f77-1926-4fc1-be46-ab8c79eb3d11","Type":"ContainerStarted","Data":"b3cbc05b690170a36ac9ca7768f81570081527ce62fb28e49bba6f471fe62b27"} Nov 26 12:42:04 crc kubenswrapper[4834]: I1126 12:42:04.176547 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" event={"ID":"10689f77-1926-4fc1-be46-ab8c79eb3d11","Type":"ContainerStarted","Data":"5ae3467e6b263072fe61dbadb91b68ebe4a948fc13d3a9176e2cd80311e6a18e"} Nov 26 12:42:04 crc kubenswrapper[4834]: I1126 12:42:04.195858 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" podStartSLOduration=1.37754211 podStartE2EDuration="2.195843499s" podCreationTimestamp="2025-11-26 12:42:02 +0000 UTC" firstStartedPulling="2025-11-26 12:42:02.951185966 +0000 UTC m=+1820.858399318" lastFinishedPulling="2025-11-26 12:42:03.769487355 +0000 UTC m=+1821.676700707" observedRunningTime="2025-11-26 12:42:04.189648686 +0000 UTC m=+1822.096862038" 
watchObservedRunningTime="2025-11-26 12:42:04.195843499 +0000 UTC m=+1822.103056851" Nov 26 12:42:28 crc kubenswrapper[4834]: I1126 12:42:28.321895 4834 generic.go:334] "Generic (PLEG): container finished" podID="10689f77-1926-4fc1-be46-ab8c79eb3d11" containerID="5ae3467e6b263072fe61dbadb91b68ebe4a948fc13d3a9176e2cd80311e6a18e" exitCode=0 Nov 26 12:42:28 crc kubenswrapper[4834]: I1126 12:42:28.321988 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" event={"ID":"10689f77-1926-4fc1-be46-ab8c79eb3d11","Type":"ContainerDied","Data":"5ae3467e6b263072fe61dbadb91b68ebe4a948fc13d3a9176e2cd80311e6a18e"} Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.613476 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.800882 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ssh-key\") pod \"10689f77-1926-4fc1-be46-ab8c79eb3d11\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.800944 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47nzd\" (UniqueName: \"kubernetes.io/projected/10689f77-1926-4fc1-be46-ab8c79eb3d11-kube-api-access-47nzd\") pod \"10689f77-1926-4fc1-be46-ab8c79eb3d11\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.801016 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ceph\") pod \"10689f77-1926-4fc1-be46-ab8c79eb3d11\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.801035 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-inventory\") pod \"10689f77-1926-4fc1-be46-ab8c79eb3d11\" (UID: \"10689f77-1926-4fc1-be46-ab8c79eb3d11\") " Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.805707 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10689f77-1926-4fc1-be46-ab8c79eb3d11-kube-api-access-47nzd" (OuterVolumeSpecName: "kube-api-access-47nzd") pod "10689f77-1926-4fc1-be46-ab8c79eb3d11" (UID: "10689f77-1926-4fc1-be46-ab8c79eb3d11"). InnerVolumeSpecName "kube-api-access-47nzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.805782 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ceph" (OuterVolumeSpecName: "ceph") pod "10689f77-1926-4fc1-be46-ab8c79eb3d11" (UID: "10689f77-1926-4fc1-be46-ab8c79eb3d11"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.822003 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-inventory" (OuterVolumeSpecName: "inventory") pod "10689f77-1926-4fc1-be46-ab8c79eb3d11" (UID: "10689f77-1926-4fc1-be46-ab8c79eb3d11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.822758 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "10689f77-1926-4fc1-be46-ab8c79eb3d11" (UID: "10689f77-1926-4fc1-be46-ab8c79eb3d11"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.902276 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.902305 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.902327 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10689f77-1926-4fc1-be46-ab8c79eb3d11-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:29 crc kubenswrapper[4834]: I1126 12:42:29.902337 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47nzd\" (UniqueName: \"kubernetes.io/projected/10689f77-1926-4fc1-be46-ab8c79eb3d11-kube-api-access-47nzd\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.335831 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" event={"ID":"10689f77-1926-4fc1-be46-ab8c79eb3d11","Type":"ContainerDied","Data":"b3cbc05b690170a36ac9ca7768f81570081527ce62fb28e49bba6f471fe62b27"} Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.336022 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3cbc05b690170a36ac9ca7768f81570081527ce62fb28e49bba6f471fe62b27" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.335871 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-575pl" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.389112 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd"] Nov 26 12:42:30 crc kubenswrapper[4834]: E1126 12:42:30.389473 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10689f77-1926-4fc1-be46-ab8c79eb3d11" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.389491 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="10689f77-1926-4fc1-be46-ab8c79eb3d11" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.389683 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="10689f77-1926-4fc1-be46-ab8c79eb3d11" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.390233 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.394170 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.394216 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.394253 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.394384 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.394598 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.406041 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd"] Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.407863 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.407955 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.407998 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjdr9\" (UniqueName: \"kubernetes.io/projected/85f0a2af-a310-4854-adec-f40a66bb0ba3-kube-api-access-mjdr9\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.408068 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.509414 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.509467 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjdr9\" (UniqueName: \"kubernetes.io/projected/85f0a2af-a310-4854-adec-f40a66bb0ba3-kube-api-access-mjdr9\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.509529 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.509585 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.512856 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.512865 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.513697 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: 
I1126 12:42:30.522921 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjdr9\" (UniqueName: \"kubernetes.io/projected/85f0a2af-a310-4854-adec-f40a66bb0ba3-kube-api-access-mjdr9\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:30 crc kubenswrapper[4834]: I1126 12:42:30.705798 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:31 crc kubenswrapper[4834]: I1126 12:42:31.125499 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd"] Nov 26 12:42:31 crc kubenswrapper[4834]: I1126 12:42:31.342063 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" event={"ID":"85f0a2af-a310-4854-adec-f40a66bb0ba3","Type":"ContainerStarted","Data":"3e069385de044386a06cbdefbc22a44b2d48832f042655a8871851a0926eb6f4"} Nov 26 12:42:32 crc kubenswrapper[4834]: I1126 12:42:32.351133 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" event={"ID":"85f0a2af-a310-4854-adec-f40a66bb0ba3","Type":"ContainerStarted","Data":"61c23d6041c5819cd2fcf175ee96395e32178957da08c4f02dc85d22886a9017"} Nov 26 12:42:32 crc kubenswrapper[4834]: I1126 12:42:32.366713 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" podStartSLOduration=1.8049467080000001 podStartE2EDuration="2.366693741s" podCreationTimestamp="2025-11-26 12:42:30 +0000 UTC" firstStartedPulling="2025-11-26 12:42:31.129854203 +0000 UTC m=+1849.037067556" lastFinishedPulling="2025-11-26 12:42:31.691601237 +0000 UTC m=+1849.598814589" 
observedRunningTime="2025-11-26 12:42:32.364811051 +0000 UTC m=+1850.272024403" watchObservedRunningTime="2025-11-26 12:42:32.366693741 +0000 UTC m=+1850.273907093" Nov 26 12:42:35 crc kubenswrapper[4834]: I1126 12:42:35.371373 4834 generic.go:334] "Generic (PLEG): container finished" podID="85f0a2af-a310-4854-adec-f40a66bb0ba3" containerID="61c23d6041c5819cd2fcf175ee96395e32178957da08c4f02dc85d22886a9017" exitCode=0 Nov 26 12:42:35 crc kubenswrapper[4834]: I1126 12:42:35.371455 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" event={"ID":"85f0a2af-a310-4854-adec-f40a66bb0ba3","Type":"ContainerDied","Data":"61c23d6041c5819cd2fcf175ee96395e32178957da08c4f02dc85d22886a9017"} Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.689465 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.801980 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-inventory\") pod \"85f0a2af-a310-4854-adec-f40a66bb0ba3\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.802137 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjdr9\" (UniqueName: \"kubernetes.io/projected/85f0a2af-a310-4854-adec-f40a66bb0ba3-kube-api-access-mjdr9\") pod \"85f0a2af-a310-4854-adec-f40a66bb0ba3\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.802235 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ssh-key\") pod \"85f0a2af-a310-4854-adec-f40a66bb0ba3\" (UID: 
\"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.802289 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ceph\") pod \"85f0a2af-a310-4854-adec-f40a66bb0ba3\" (UID: \"85f0a2af-a310-4854-adec-f40a66bb0ba3\") " Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.807599 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f0a2af-a310-4854-adec-f40a66bb0ba3-kube-api-access-mjdr9" (OuterVolumeSpecName: "kube-api-access-mjdr9") pod "85f0a2af-a310-4854-adec-f40a66bb0ba3" (UID: "85f0a2af-a310-4854-adec-f40a66bb0ba3"). InnerVolumeSpecName "kube-api-access-mjdr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.807725 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ceph" (OuterVolumeSpecName: "ceph") pod "85f0a2af-a310-4854-adec-f40a66bb0ba3" (UID: "85f0a2af-a310-4854-adec-f40a66bb0ba3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.824603 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "85f0a2af-a310-4854-adec-f40a66bb0ba3" (UID: "85f0a2af-a310-4854-adec-f40a66bb0ba3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.829250 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-inventory" (OuterVolumeSpecName: "inventory") pod "85f0a2af-a310-4854-adec-f40a66bb0ba3" (UID: "85f0a2af-a310-4854-adec-f40a66bb0ba3"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.904236 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.904270 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.904280 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjdr9\" (UniqueName: \"kubernetes.io/projected/85f0a2af-a310-4854-adec-f40a66bb0ba3-kube-api-access-mjdr9\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:36 crc kubenswrapper[4834]: I1126 12:42:36.904288 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85f0a2af-a310-4854-adec-f40a66bb0ba3-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.389551 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" event={"ID":"85f0a2af-a310-4854-adec-f40a66bb0ba3","Type":"ContainerDied","Data":"3e069385de044386a06cbdefbc22a44b2d48832f042655a8871851a0926eb6f4"} Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.389595 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.389626 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e069385de044386a06cbdefbc22a44b2d48832f042655a8871851a0926eb6f4" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.441161 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb"] Nov 26 12:42:37 crc kubenswrapper[4834]: E1126 12:42:37.441508 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f0a2af-a310-4854-adec-f40a66bb0ba3" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.441530 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f0a2af-a310-4854-adec-f40a66bb0ba3" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.441731 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f0a2af-a310-4854-adec-f40a66bb0ba3" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.442261 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.444925 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.445695 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.445754 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.445707 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.454936 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.462126 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb"] Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.517456 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7vt4\" (UniqueName: \"kubernetes.io/projected/c9dc53e0-b81a-427b-9459-9cef85f9be65-kube-api-access-j7vt4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.517534 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: 
\"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.518349 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.518588 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.620476 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.620820 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.620885 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j7vt4\" (UniqueName: \"kubernetes.io/projected/c9dc53e0-b81a-427b-9459-9cef85f9be65-kube-api-access-j7vt4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.620907 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.624100 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.624164 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.624608 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 
crc kubenswrapper[4834]: I1126 12:42:37.637000 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7vt4\" (UniqueName: \"kubernetes.io/projected/c9dc53e0-b81a-427b-9459-9cef85f9be65-kube-api-access-j7vt4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:37 crc kubenswrapper[4834]: I1126 12:42:37.765159 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:42:38 crc kubenswrapper[4834]: I1126 12:42:38.217482 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb"] Nov 26 12:42:38 crc kubenswrapper[4834]: I1126 12:42:38.402235 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" event={"ID":"c9dc53e0-b81a-427b-9459-9cef85f9be65","Type":"ContainerStarted","Data":"52198f7b9a1b8d6608be9bbbb23aad61c0182f01f0cef88c69d3e14382e39af4"} Nov 26 12:42:39 crc kubenswrapper[4834]: I1126 12:42:39.411033 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" event={"ID":"c9dc53e0-b81a-427b-9459-9cef85f9be65","Type":"ContainerStarted","Data":"7dad78302b55dbd6f800f0323826b892b35dc9a77063ea2ddc9ed51a2342947a"} Nov 26 12:42:39 crc kubenswrapper[4834]: I1126 12:42:39.427025 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" podStartSLOduration=1.710600946 podStartE2EDuration="2.427006919s" podCreationTimestamp="2025-11-26 12:42:37 +0000 UTC" firstStartedPulling="2025-11-26 12:42:38.222342234 +0000 UTC m=+1856.129555585" lastFinishedPulling="2025-11-26 12:42:38.938748206 +0000 UTC m=+1856.845961558" 
observedRunningTime="2025-11-26 12:42:39.422077893 +0000 UTC m=+1857.329291246" watchObservedRunningTime="2025-11-26 12:42:39.427006919 +0000 UTC m=+1857.334220272" Nov 26 12:42:46 crc kubenswrapper[4834]: I1126 12:42:46.124244 4834 scope.go:117] "RemoveContainer" containerID="ab3e8d77415d1f722a91daaa236954cb3c1a052e5e797bc4c56470d258e86cdb" Nov 26 12:42:46 crc kubenswrapper[4834]: I1126 12:42:46.150367 4834 scope.go:117] "RemoveContainer" containerID="e10a2ba759403c41fa78a601b1d66ac0faacccbe4b8e3fcab6a97c044ca3155c" Nov 26 12:43:09 crc kubenswrapper[4834]: I1126 12:43:09.622601 4834 generic.go:334] "Generic (PLEG): container finished" podID="c9dc53e0-b81a-427b-9459-9cef85f9be65" containerID="7dad78302b55dbd6f800f0323826b892b35dc9a77063ea2ddc9ed51a2342947a" exitCode=0 Nov 26 12:43:09 crc kubenswrapper[4834]: I1126 12:43:09.622677 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" event={"ID":"c9dc53e0-b81a-427b-9459-9cef85f9be65","Type":"ContainerDied","Data":"7dad78302b55dbd6f800f0323826b892b35dc9a77063ea2ddc9ed51a2342947a"} Nov 26 12:43:10 crc kubenswrapper[4834]: I1126 12:43:10.947142 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.024130 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ssh-key\") pod \"c9dc53e0-b81a-427b-9459-9cef85f9be65\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.024236 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7vt4\" (UniqueName: \"kubernetes.io/projected/c9dc53e0-b81a-427b-9459-9cef85f9be65-kube-api-access-j7vt4\") pod \"c9dc53e0-b81a-427b-9459-9cef85f9be65\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.024318 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-inventory\") pod \"c9dc53e0-b81a-427b-9459-9cef85f9be65\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.024349 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ceph\") pod \"c9dc53e0-b81a-427b-9459-9cef85f9be65\" (UID: \"c9dc53e0-b81a-427b-9459-9cef85f9be65\") " Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.029465 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ceph" (OuterVolumeSpecName: "ceph") pod "c9dc53e0-b81a-427b-9459-9cef85f9be65" (UID: "c9dc53e0-b81a-427b-9459-9cef85f9be65"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.029669 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9dc53e0-b81a-427b-9459-9cef85f9be65-kube-api-access-j7vt4" (OuterVolumeSpecName: "kube-api-access-j7vt4") pod "c9dc53e0-b81a-427b-9459-9cef85f9be65" (UID: "c9dc53e0-b81a-427b-9459-9cef85f9be65"). InnerVolumeSpecName "kube-api-access-j7vt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.044324 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-inventory" (OuterVolumeSpecName: "inventory") pod "c9dc53e0-b81a-427b-9459-9cef85f9be65" (UID: "c9dc53e0-b81a-427b-9459-9cef85f9be65"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.045023 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c9dc53e0-b81a-427b-9459-9cef85f9be65" (UID: "c9dc53e0-b81a-427b-9459-9cef85f9be65"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.127002 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.127033 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.127042 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7vt4\" (UniqueName: \"kubernetes.io/projected/c9dc53e0-b81a-427b-9459-9cef85f9be65-kube-api-access-j7vt4\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.127051 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9dc53e0-b81a-427b-9459-9cef85f9be65-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.636164 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" event={"ID":"c9dc53e0-b81a-427b-9459-9cef85f9be65","Type":"ContainerDied","Data":"52198f7b9a1b8d6608be9bbbb23aad61c0182f01f0cef88c69d3e14382e39af4"} Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.636216 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52198f7b9a1b8d6608be9bbbb23aad61c0182f01f0cef88c69d3e14382e39af4" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.636235 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.706005 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-lzj67"] Nov 26 12:43:11 crc kubenswrapper[4834]: E1126 12:43:11.706466 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dc53e0-b81a-427b-9459-9cef85f9be65" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.706489 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dc53e0-b81a-427b-9459-9cef85f9be65" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.706698 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9dc53e0-b81a-427b-9459-9cef85f9be65" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.707404 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.709406 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.709830 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.709841 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.709894 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.710102 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.713362 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-lzj67"] Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.742061 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ceph\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.742394 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 
12:43:11.742448 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.742666 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz95v\" (UniqueName: \"kubernetes.io/projected/3ec594bd-dbbe-4439-b255-798ecb061060-kube-api-access-sz95v\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.844322 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ceph\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.844455 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.844486 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.844537 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz95v\" (UniqueName: \"kubernetes.io/projected/3ec594bd-dbbe-4439-b255-798ecb061060-kube-api-access-sz95v\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.850005 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.850050 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ceph\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.850252 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:11 crc kubenswrapper[4834]: I1126 12:43:11.862634 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz95v\" (UniqueName: \"kubernetes.io/projected/3ec594bd-dbbe-4439-b255-798ecb061060-kube-api-access-sz95v\") pod \"ssh-known-hosts-edpm-deployment-lzj67\" (UID: 
\"3ec594bd-dbbe-4439-b255-798ecb061060\") " pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:12 crc kubenswrapper[4834]: I1126 12:43:12.020422 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:12 crc kubenswrapper[4834]: I1126 12:43:12.446732 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-lzj67"] Nov 26 12:43:12 crc kubenswrapper[4834]: I1126 12:43:12.643883 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" event={"ID":"3ec594bd-dbbe-4439-b255-798ecb061060","Type":"ContainerStarted","Data":"cb941e657916cd91c84a63433f569cb4bab1ad63901082a39abd86fcc5006118"} Nov 26 12:43:13 crc kubenswrapper[4834]: I1126 12:43:13.651553 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" event={"ID":"3ec594bd-dbbe-4439-b255-798ecb061060","Type":"ContainerStarted","Data":"7461099cefdfc28a94f257d6d6ebdfa81cc730e8702c756783d3fd9fe63f8fec"} Nov 26 12:43:13 crc kubenswrapper[4834]: I1126 12:43:13.667041 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" podStartSLOduration=2.08009878 podStartE2EDuration="2.66702419s" podCreationTimestamp="2025-11-26 12:43:11 +0000 UTC" firstStartedPulling="2025-11-26 12:43:12.450290602 +0000 UTC m=+1890.357503955" lastFinishedPulling="2025-11-26 12:43:13.037216013 +0000 UTC m=+1890.944429365" observedRunningTime="2025-11-26 12:43:13.664916646 +0000 UTC m=+1891.572129998" watchObservedRunningTime="2025-11-26 12:43:13.66702419 +0000 UTC m=+1891.574237543" Nov 26 12:43:20 crc kubenswrapper[4834]: I1126 12:43:20.693563 4834 generic.go:334] "Generic (PLEG): container finished" podID="3ec594bd-dbbe-4439-b255-798ecb061060" containerID="7461099cefdfc28a94f257d6d6ebdfa81cc730e8702c756783d3fd9fe63f8fec" exitCode=0 
Nov 26 12:43:20 crc kubenswrapper[4834]: I1126 12:43:20.693653 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" event={"ID":"3ec594bd-dbbe-4439-b255-798ecb061060","Type":"ContainerDied","Data":"7461099cefdfc28a94f257d6d6ebdfa81cc730e8702c756783d3fd9fe63f8fec"} Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.011824 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.102957 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz95v\" (UniqueName: \"kubernetes.io/projected/3ec594bd-dbbe-4439-b255-798ecb061060-kube-api-access-sz95v\") pod \"3ec594bd-dbbe-4439-b255-798ecb061060\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.103150 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ssh-key-openstack-edpm-ipam\") pod \"3ec594bd-dbbe-4439-b255-798ecb061060\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.103186 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-inventory-0\") pod \"3ec594bd-dbbe-4439-b255-798ecb061060\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.103220 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ceph\") pod \"3ec594bd-dbbe-4439-b255-798ecb061060\" (UID: \"3ec594bd-dbbe-4439-b255-798ecb061060\") " Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 
12:43:22.109243 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ceph" (OuterVolumeSpecName: "ceph") pod "3ec594bd-dbbe-4439-b255-798ecb061060" (UID: "3ec594bd-dbbe-4439-b255-798ecb061060"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.109285 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec594bd-dbbe-4439-b255-798ecb061060-kube-api-access-sz95v" (OuterVolumeSpecName: "kube-api-access-sz95v") pod "3ec594bd-dbbe-4439-b255-798ecb061060" (UID: "3ec594bd-dbbe-4439-b255-798ecb061060"). InnerVolumeSpecName "kube-api-access-sz95v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.125560 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3ec594bd-dbbe-4439-b255-798ecb061060" (UID: "3ec594bd-dbbe-4439-b255-798ecb061060"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.125895 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3ec594bd-dbbe-4439-b255-798ecb061060" (UID: "3ec594bd-dbbe-4439-b255-798ecb061060"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.204867 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.204910 4834 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.204920 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ec594bd-dbbe-4439-b255-798ecb061060-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.204929 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz95v\" (UniqueName: \"kubernetes.io/projected/3ec594bd-dbbe-4439-b255-798ecb061060-kube-api-access-sz95v\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.710547 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" event={"ID":"3ec594bd-dbbe-4439-b255-798ecb061060","Type":"ContainerDied","Data":"cb941e657916cd91c84a63433f569cb4bab1ad63901082a39abd86fcc5006118"} Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.710593 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb941e657916cd91c84a63433f569cb4bab1ad63901082a39abd86fcc5006118" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.710646 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-lzj67" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.759552 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq"] Nov 26 12:43:22 crc kubenswrapper[4834]: E1126 12:43:22.759988 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec594bd-dbbe-4439-b255-798ecb061060" containerName="ssh-known-hosts-edpm-deployment" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.760010 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec594bd-dbbe-4439-b255-798ecb061060" containerName="ssh-known-hosts-edpm-deployment" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.760189 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ec594bd-dbbe-4439-b255-798ecb061060" containerName="ssh-known-hosts-edpm-deployment" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.760923 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.762982 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.763147 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.763349 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.765137 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.765293 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.773203 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq"] Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.915846 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t77f\" (UniqueName: \"kubernetes.io/projected/851a4e1d-cfa4-495d-ac61-cce63eff30bc-kube-api-access-2t77f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.916192 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.916234 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:22 crc kubenswrapper[4834]: I1126 12:43:22.916397 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.018633 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.018708 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t77f\" (UniqueName: \"kubernetes.io/projected/851a4e1d-cfa4-495d-ac61-cce63eff30bc-kube-api-access-2t77f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.018778 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.018814 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.023489 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.023536 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.023872 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.032894 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-2t77f\" (UniqueName: \"kubernetes.io/projected/851a4e1d-cfa4-495d-ac61-cce63eff30bc-kube-api-access-2t77f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4lsrq\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.073813 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.519102 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq"] Nov 26 12:43:23 crc kubenswrapper[4834]: I1126 12:43:23.718352 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" event={"ID":"851a4e1d-cfa4-495d-ac61-cce63eff30bc","Type":"ContainerStarted","Data":"8548776c42a67becf38891a056e53eb909c38c066600f4c3b9693a73cdc41d9e"} Nov 26 12:43:24 crc kubenswrapper[4834]: I1126 12:43:24.730006 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" event={"ID":"851a4e1d-cfa4-495d-ac61-cce63eff30bc","Type":"ContainerStarted","Data":"158cbb82ff98706b6c441e7df1029042a853d2794c11d852380617e0d9bc107f"} Nov 26 12:43:30 crc kubenswrapper[4834]: I1126 12:43:30.774111 4834 generic.go:334] "Generic (PLEG): container finished" podID="851a4e1d-cfa4-495d-ac61-cce63eff30bc" containerID="158cbb82ff98706b6c441e7df1029042a853d2794c11d852380617e0d9bc107f" exitCode=0 Nov 26 12:43:30 crc kubenswrapper[4834]: I1126 12:43:30.774157 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" event={"ID":"851a4e1d-cfa4-495d-ac61-cce63eff30bc","Type":"ContainerDied","Data":"158cbb82ff98706b6c441e7df1029042a853d2794c11d852380617e0d9bc107f"} Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 
12:43:32.076797 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.161556 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ssh-key\") pod \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.181657 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "851a4e1d-cfa4-495d-ac61-cce63eff30bc" (UID: "851a4e1d-cfa4-495d-ac61-cce63eff30bc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.262637 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-inventory\") pod \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.262690 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t77f\" (UniqueName: \"kubernetes.io/projected/851a4e1d-cfa4-495d-ac61-cce63eff30bc-kube-api-access-2t77f\") pod \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.262746 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ceph\") pod \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\" (UID: \"851a4e1d-cfa4-495d-ac61-cce63eff30bc\") " Nov 26 12:43:32 crc kubenswrapper[4834]: 
I1126 12:43:32.262995 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.266395 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/851a4e1d-cfa4-495d-ac61-cce63eff30bc-kube-api-access-2t77f" (OuterVolumeSpecName: "kube-api-access-2t77f") pod "851a4e1d-cfa4-495d-ac61-cce63eff30bc" (UID: "851a4e1d-cfa4-495d-ac61-cce63eff30bc"). InnerVolumeSpecName "kube-api-access-2t77f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.267704 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ceph" (OuterVolumeSpecName: "ceph") pod "851a4e1d-cfa4-495d-ac61-cce63eff30bc" (UID: "851a4e1d-cfa4-495d-ac61-cce63eff30bc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.280299 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-inventory" (OuterVolumeSpecName: "inventory") pod "851a4e1d-cfa4-495d-ac61-cce63eff30bc" (UID: "851a4e1d-cfa4-495d-ac61-cce63eff30bc"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.365031 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.365063 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t77f\" (UniqueName: \"kubernetes.io/projected/851a4e1d-cfa4-495d-ac61-cce63eff30bc-kube-api-access-2t77f\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.365073 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/851a4e1d-cfa4-495d-ac61-cce63eff30bc-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.789474 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" event={"ID":"851a4e1d-cfa4-495d-ac61-cce63eff30bc","Type":"ContainerDied","Data":"8548776c42a67becf38891a056e53eb909c38c066600f4c3b9693a73cdc41d9e"} Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.789700 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8548776c42a67becf38891a056e53eb909c38c066600f4c3b9693a73cdc41d9e" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.789527 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4lsrq" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.855951 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"] Nov 26 12:43:32 crc kubenswrapper[4834]: E1126 12:43:32.856387 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="851a4e1d-cfa4-495d-ac61-cce63eff30bc" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.856418 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="851a4e1d-cfa4-495d-ac61-cce63eff30bc" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.856611 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="851a4e1d-cfa4-495d-ac61-cce63eff30bc" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.857258 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.859194 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.859267 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.859723 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.859778 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.860005 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.862381 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"]
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.878506 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.878573 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.878605 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.878636 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-kube-api-access-8jlm5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.979981 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.980044 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.980071 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.980094 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-kube-api-access-8jlm5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.983547 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.984058 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.984137 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:32 crc kubenswrapper[4834]: I1126 12:43:32.994933 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-kube-api-access-8jlm5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-686bp\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:33 crc kubenswrapper[4834]: I1126 12:43:33.178432 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:33 crc kubenswrapper[4834]: I1126 12:43:33.594649 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"]
Nov 26 12:43:33 crc kubenswrapper[4834]: I1126 12:43:33.797356 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp" event={"ID":"d0280880-1382-46e5-9c25-f9c3b9d8f3bd","Type":"ContainerStarted","Data":"655d9a3ca54c6b2840a501b42f8d6f75980886980a010945b8c623ebcabc530f"}
Nov 26 12:43:34 crc kubenswrapper[4834]: I1126 12:43:34.805467 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp" event={"ID":"d0280880-1382-46e5-9c25-f9c3b9d8f3bd","Type":"ContainerStarted","Data":"4babf6700a74cc64029db723aa4a39aced6bdd04a9c4e43c9ded741534a80b82"}
Nov 26 12:43:34 crc kubenswrapper[4834]: I1126 12:43:34.821664 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp" podStartSLOduration=2.076196035 podStartE2EDuration="2.821652077s" podCreationTimestamp="2025-11-26 12:43:32 +0000 UTC" firstStartedPulling="2025-11-26 12:43:33.598348948 +0000 UTC m=+1911.505562300" lastFinishedPulling="2025-11-26 12:43:34.34380499 +0000 UTC m=+1912.251018342" observedRunningTime="2025-11-26 12:43:34.818366951 +0000 UTC m=+1912.725580304" watchObservedRunningTime="2025-11-26 12:43:34.821652077 +0000 UTC m=+1912.728865429"
Nov 26 12:43:41 crc kubenswrapper[4834]: I1126 12:43:41.851621 4834 generic.go:334] "Generic (PLEG): container finished" podID="d0280880-1382-46e5-9c25-f9c3b9d8f3bd" containerID="4babf6700a74cc64029db723aa4a39aced6bdd04a9c4e43c9ded741534a80b82" exitCode=0
Nov 26 12:43:41 crc kubenswrapper[4834]: I1126 12:43:41.851727 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp" event={"ID":"d0280880-1382-46e5-9c25-f9c3b9d8f3bd","Type":"ContainerDied","Data":"4babf6700a74cc64029db723aa4a39aced6bdd04a9c4e43c9ded741534a80b82"}
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.144166 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.342279 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-inventory\") pod \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") "
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.342369 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-kube-api-access-8jlm5\") pod \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") "
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.342438 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ssh-key\") pod \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") "
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.342468 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ceph\") pod \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\" (UID: \"d0280880-1382-46e5-9c25-f9c3b9d8f3bd\") "
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.347018 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-kube-api-access-8jlm5" (OuterVolumeSpecName: "kube-api-access-8jlm5") pod "d0280880-1382-46e5-9c25-f9c3b9d8f3bd" (UID: "d0280880-1382-46e5-9c25-f9c3b9d8f3bd"). InnerVolumeSpecName "kube-api-access-8jlm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.347097 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ceph" (OuterVolumeSpecName: "ceph") pod "d0280880-1382-46e5-9c25-f9c3b9d8f3bd" (UID: "d0280880-1382-46e5-9c25-f9c3b9d8f3bd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.361857 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-inventory" (OuterVolumeSpecName: "inventory") pod "d0280880-1382-46e5-9c25-f9c3b9d8f3bd" (UID: "d0280880-1382-46e5-9c25-f9c3b9d8f3bd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.362665 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d0280880-1382-46e5-9c25-f9c3b9d8f3bd" (UID: "d0280880-1382-46e5-9c25-f9c3b9d8f3bd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.444750 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-inventory\") on node \"crc\" DevicePath \"\""
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.444773 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-kube-api-access-8jlm5\") on node \"crc\" DevicePath \"\""
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.444781 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.444789 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0280880-1382-46e5-9c25-f9c3b9d8f3bd-ceph\") on node \"crc\" DevicePath \"\""
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.864693 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp" event={"ID":"d0280880-1382-46e5-9c25-f9c3b9d8f3bd","Type":"ContainerDied","Data":"655d9a3ca54c6b2840a501b42f8d6f75980886980a010945b8c623ebcabc530f"}
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.864914 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="655d9a3ca54c6b2840a501b42f8d6f75980886980a010945b8c623ebcabc530f"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.864768 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-686bp"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.925259 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"]
Nov 26 12:43:43 crc kubenswrapper[4834]: E1126 12:43:43.925627 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0280880-1382-46e5-9c25-f9c3b9d8f3bd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.925644 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0280880-1382-46e5-9c25-f9c3b9d8f3bd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.925791 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0280880-1382-46e5-9c25-f9c3b9d8f3bd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.926345 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.928382 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.928553 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.928634 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.928806 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.929018 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.929086 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.929513 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.930474 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.934532 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"]
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952089 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952129 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952165 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952206 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952226 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952247 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952302 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952379 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952431 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952475 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952500 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pd4g\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-kube-api-access-8pd4g\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952519 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:43 crc kubenswrapper[4834]: I1126 12:43:43.952549 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.053730 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.053793 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.053835 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.053854 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pd4g\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-kube-api-access-8pd4g\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.053869 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.053896 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.053920 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.053936 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.054007 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.054043 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.054060 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.054077 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.054094 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.060539 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.064948 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.065125 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.065156 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.066494 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.074947 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.077867 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.078884 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.080820 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.082645 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.084863 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pd4g\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-kube-api-access-8pd4g\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.086079 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.102246 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.252404 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.664007 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"]
Nov 26 12:43:44 crc kubenswrapper[4834]: I1126 12:43:44.871879 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr" event={"ID":"797b63c9-b38e-407e-b2a2-8999021770ce","Type":"ContainerStarted","Data":"8610c58b7b4317b3513ab62d49aae98d143280f536fca141caffbe2f4fb830af"}
Nov 26 12:43:45 crc kubenswrapper[4834]: I1126 12:43:45.879565 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr" event={"ID":"797b63c9-b38e-407e-b2a2-8999021770ce","Type":"ContainerStarted","Data":"a78bb04e6edeb0522042d550ed41af8f5626b6cab79dd449c2bcfb5497c7933c"}
Nov 26 12:43:51 crc kubenswrapper[4834]: I1126 12:43:51.531461 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 12:43:51 crc kubenswrapper[4834]: I1126 12:43:51.532085 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 12:44:09 crc kubenswrapper[4834]: I1126 12:44:09.043685 4834 generic.go:334] "Generic (PLEG): container finished" podID="797b63c9-b38e-407e-b2a2-8999021770ce" containerID="a78bb04e6edeb0522042d550ed41af8f5626b6cab79dd449c2bcfb5497c7933c" exitCode=0
Nov 26 12:44:09 crc kubenswrapper[4834]: I1126 12:44:09.043768 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr" event={"ID":"797b63c9-b38e-407e-b2a2-8999021770ce","Type":"ContainerDied","Data":"a78bb04e6edeb0522042d550ed41af8f5626b6cab79dd449c2bcfb5497c7933c"}
Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.357149 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr"
Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.508818 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") "
Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.508886 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-repo-setup-combined-ca-bundle\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") "
Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509074 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-libvirt-combined-ca-bundle\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") "
Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509235 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-inventory\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") "
Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509268 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") "
Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509341 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-ovn-default-certs-0\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") "
Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509402 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-neutron-metadata-combined-ca-bundle\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") "
Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509440 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName:
\"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ceph\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509503 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ovn-combined-ca-bundle\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509548 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-bootstrap-combined-ca-bundle\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509604 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-nova-combined-ca-bundle\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509641 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pd4g\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-kube-api-access-8pd4g\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.509662 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ssh-key\") pod \"797b63c9-b38e-407e-b2a2-8999021770ce\" (UID: \"797b63c9-b38e-407e-b2a2-8999021770ce\") " Nov 26 12:44:10 
crc kubenswrapper[4834]: I1126 12:44:10.515892 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-kube-api-access-8pd4g" (OuterVolumeSpecName: "kube-api-access-8pd4g") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "kube-api-access-8pd4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.516785 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.516911 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.516984 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.517883 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ceph" (OuterVolumeSpecName: "ceph") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.518684 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.518803 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.519811 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.519834 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.519956 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.521862 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.535602 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.536565 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-inventory" (OuterVolumeSpecName: "inventory") pod "797b63c9-b38e-407e-b2a2-8999021770ce" (UID: "797b63c9-b38e-407e-b2a2-8999021770ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612819 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612848 4834 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612861 4834 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612871 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612880 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 
crc kubenswrapper[4834]: I1126 12:44:10.612890 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612901 4834 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612910 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612920 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612931 4834 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612941 4834 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612950 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pd4g\" (UniqueName: \"kubernetes.io/projected/797b63c9-b38e-407e-b2a2-8999021770ce-kube-api-access-8pd4g\") on node \"crc\" DevicePath \"\"" Nov 26 
12:44:10 crc kubenswrapper[4834]: I1126 12:44:10.612958 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/797b63c9-b38e-407e-b2a2-8999021770ce-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.062992 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr" event={"ID":"797b63c9-b38e-407e-b2a2-8999021770ce","Type":"ContainerDied","Data":"8610c58b7b4317b3513ab62d49aae98d143280f536fca141caffbe2f4fb830af"} Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.063054 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8610c58b7b4317b3513ab62d49aae98d143280f536fca141caffbe2f4fb830af" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.063072 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.211093 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br"] Nov 26 12:44:11 crc kubenswrapper[4834]: E1126 12:44:11.211471 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797b63c9-b38e-407e-b2a2-8999021770ce" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.211489 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="797b63c9-b38e-407e-b2a2-8999021770ce" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.211662 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="797b63c9-b38e-407e-b2a2-8999021770ce" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.212210 4834 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.215151 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.215151 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.215153 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.215625 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.218477 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.218875 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br"] Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.222451 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.222553 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.222714 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.222797 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndrq8\" (UniqueName: \"kubernetes.io/projected/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-kube-api-access-ndrq8\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.324704 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.324813 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.324873 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrq8\" 
(UniqueName: \"kubernetes.io/projected/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-kube-api-access-ndrq8\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.324904 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.330098 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.330774 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.330769 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 
12:44:11.339034 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrq8\" (UniqueName: \"kubernetes.io/projected/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-kube-api-access-ndrq8\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-j49br\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.525022 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:11 crc kubenswrapper[4834]: I1126 12:44:11.936067 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br"] Nov 26 12:44:12 crc kubenswrapper[4834]: I1126 12:44:12.073269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" event={"ID":"f3175863-4ff8-4c14-82c8-2a1ae6bccf38","Type":"ContainerStarted","Data":"8154c17551a0d481a7c04644370c3cb929e6757da8f68a0e7c9f4b3ac14eff6b"} Nov 26 12:44:13 crc kubenswrapper[4834]: I1126 12:44:13.081937 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" event={"ID":"f3175863-4ff8-4c14-82c8-2a1ae6bccf38","Type":"ContainerStarted","Data":"fdeea78013a09cb3e03556a8e2c131f8e74722bf63a3124d5da0c74a42918ef9"} Nov 26 12:44:13 crc kubenswrapper[4834]: I1126 12:44:13.104060 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" podStartSLOduration=1.512536512 podStartE2EDuration="2.104037481s" podCreationTimestamp="2025-11-26 12:44:11 +0000 UTC" firstStartedPulling="2025-11-26 12:44:11.941409136 +0000 UTC m=+1949.848622489" lastFinishedPulling="2025-11-26 12:44:12.532910096 +0000 UTC m=+1950.440123458" observedRunningTime="2025-11-26 
12:44:13.095818671 +0000 UTC m=+1951.003032022" watchObservedRunningTime="2025-11-26 12:44:13.104037481 +0000 UTC m=+1951.011250833" Nov 26 12:44:17 crc kubenswrapper[4834]: I1126 12:44:17.113360 4834 generic.go:334] "Generic (PLEG): container finished" podID="f3175863-4ff8-4c14-82c8-2a1ae6bccf38" containerID="fdeea78013a09cb3e03556a8e2c131f8e74722bf63a3124d5da0c74a42918ef9" exitCode=0 Nov 26 12:44:17 crc kubenswrapper[4834]: I1126 12:44:17.113455 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" event={"ID":"f3175863-4ff8-4c14-82c8-2a1ae6bccf38","Type":"ContainerDied","Data":"fdeea78013a09cb3e03556a8e2c131f8e74722bf63a3124d5da0c74a42918ef9"} Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.445922 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.555734 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ssh-key\") pod \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.556201 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-inventory\") pod \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.556304 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ceph\") pod \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 
12:44:18.556433 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndrq8\" (UniqueName: \"kubernetes.io/projected/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-kube-api-access-ndrq8\") pod \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\" (UID: \"f3175863-4ff8-4c14-82c8-2a1ae6bccf38\") " Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.562401 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ceph" (OuterVolumeSpecName: "ceph") pod "f3175863-4ff8-4c14-82c8-2a1ae6bccf38" (UID: "f3175863-4ff8-4c14-82c8-2a1ae6bccf38"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.562427 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-kube-api-access-ndrq8" (OuterVolumeSpecName: "kube-api-access-ndrq8") pod "f3175863-4ff8-4c14-82c8-2a1ae6bccf38" (UID: "f3175863-4ff8-4c14-82c8-2a1ae6bccf38"). InnerVolumeSpecName "kube-api-access-ndrq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.578079 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-inventory" (OuterVolumeSpecName: "inventory") pod "f3175863-4ff8-4c14-82c8-2a1ae6bccf38" (UID: "f3175863-4ff8-4c14-82c8-2a1ae6bccf38"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.582735 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f3175863-4ff8-4c14-82c8-2a1ae6bccf38" (UID: "f3175863-4ff8-4c14-82c8-2a1ae6bccf38"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.660374 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.660407 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.660424 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndrq8\" (UniqueName: \"kubernetes.io/projected/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-kube-api-access-ndrq8\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:18 crc kubenswrapper[4834]: I1126 12:44:18.660436 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3175863-4ff8-4c14-82c8-2a1ae6bccf38-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.132091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" event={"ID":"f3175863-4ff8-4c14-82c8-2a1ae6bccf38","Type":"ContainerDied","Data":"8154c17551a0d481a7c04644370c3cb929e6757da8f68a0e7c9f4b3ac14eff6b"} Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.132150 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8154c17551a0d481a7c04644370c3cb929e6757da8f68a0e7c9f4b3ac14eff6b" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.132183 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-j49br" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.218085 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4"] Nov 26 12:44:19 crc kubenswrapper[4834]: E1126 12:44:19.218690 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3175863-4ff8-4c14-82c8-2a1ae6bccf38" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.218721 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3175863-4ff8-4c14-82c8-2a1ae6bccf38" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.218903 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3175863-4ff8-4c14-82c8-2a1ae6bccf38" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.219593 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.221935 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.221963 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.222588 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.222715 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.222815 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.226886 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.228857 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4"] Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.274115 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.274433 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.274481 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.274667 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd5b6\" (UniqueName: \"kubernetes.io/projected/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-kube-api-access-wd5b6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.274887 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.274998 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" 
Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.377023 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd5b6\" (UniqueName: \"kubernetes.io/projected/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-kube-api-access-wd5b6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.377086 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.377117 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.377185 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.377254 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ssh-key\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.377283 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.378059 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.380679 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.380687 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.381056 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.381520 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.392909 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd5b6\" (UniqueName: \"kubernetes.io/projected/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-kube-api-access-wd5b6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m2cd4\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.534722 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:44:19 crc kubenswrapper[4834]: I1126 12:44:19.984037 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4"] Nov 26 12:44:20 crc kubenswrapper[4834]: I1126 12:44:20.139628 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" event={"ID":"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6","Type":"ContainerStarted","Data":"a4b0f414cb39b421006488a70fcb7c3b65093caefe45004b48ae74441d1b07b5"} Nov 26 12:44:21 crc kubenswrapper[4834]: I1126 12:44:21.149016 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" event={"ID":"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6","Type":"ContainerStarted","Data":"53806cfb1c47c466f9db92015b68f5f7229d18714cc5f485c5801da667072e37"} Nov 26 12:44:21 crc kubenswrapper[4834]: I1126 12:44:21.170302 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" podStartSLOduration=1.489013521 podStartE2EDuration="2.170287474s" podCreationTimestamp="2025-11-26 12:44:19 +0000 UTC" firstStartedPulling="2025-11-26 12:44:19.989868646 +0000 UTC m=+1957.897081998" lastFinishedPulling="2025-11-26 12:44:20.671142598 +0000 UTC m=+1958.578355951" observedRunningTime="2025-11-26 12:44:21.167036272 +0000 UTC m=+1959.074249623" watchObservedRunningTime="2025-11-26 12:44:21.170287474 +0000 UTC m=+1959.077500826" Nov 26 12:44:21 crc kubenswrapper[4834]: I1126 12:44:21.531225 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:44:21 crc kubenswrapper[4834]: I1126 12:44:21.531289 4834 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:44:51 crc kubenswrapper[4834]: I1126 12:44:51.531121 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:44:51 crc kubenswrapper[4834]: I1126 12:44:51.531736 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:44:51 crc kubenswrapper[4834]: I1126 12:44:51.531783 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:44:51 crc kubenswrapper[4834]: I1126 12:44:51.532345 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91771d6ddc659e4116cd0d5491ab5dd50d9b99c3c9aec05e2aca8bfb28c92527"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:44:51 crc kubenswrapper[4834]: I1126 12:44:51.532396 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" 
containerName="machine-config-daemon" containerID="cri-o://91771d6ddc659e4116cd0d5491ab5dd50d9b99c3c9aec05e2aca8bfb28c92527" gracePeriod=600 Nov 26 12:44:52 crc kubenswrapper[4834]: I1126 12:44:52.365244 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="91771d6ddc659e4116cd0d5491ab5dd50d9b99c3c9aec05e2aca8bfb28c92527" exitCode=0 Nov 26 12:44:52 crc kubenswrapper[4834]: I1126 12:44:52.365305 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"91771d6ddc659e4116cd0d5491ab5dd50d9b99c3c9aec05e2aca8bfb28c92527"} Nov 26 12:44:52 crc kubenswrapper[4834]: I1126 12:44:52.365601 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88"} Nov 26 12:44:52 crc kubenswrapper[4834]: I1126 12:44:52.365631 4834 scope.go:117] "RemoveContainer" containerID="aba7bd5109d66c03d1488a442864455b22766070ee45bc78cc9856eb8e10f2d8" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.131328 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks"] Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.132797 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.134021 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.134293 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.137206 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks"] Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.228130 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3c47406-824a-4a88-a637-3a8dd2e86034-secret-volume\") pod \"collect-profiles-29402685-4x5ks\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.228197 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24kkt\" (UniqueName: \"kubernetes.io/projected/c3c47406-824a-4a88-a637-3a8dd2e86034-kube-api-access-24kkt\") pod \"collect-profiles-29402685-4x5ks\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.228510 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3c47406-824a-4a88-a637-3a8dd2e86034-config-volume\") pod \"collect-profiles-29402685-4x5ks\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.330293 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3c47406-824a-4a88-a637-3a8dd2e86034-secret-volume\") pod \"collect-profiles-29402685-4x5ks\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.330391 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24kkt\" (UniqueName: \"kubernetes.io/projected/c3c47406-824a-4a88-a637-3a8dd2e86034-kube-api-access-24kkt\") pod \"collect-profiles-29402685-4x5ks\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.330493 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3c47406-824a-4a88-a637-3a8dd2e86034-config-volume\") pod \"collect-profiles-29402685-4x5ks\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.331244 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3c47406-824a-4a88-a637-3a8dd2e86034-config-volume\") pod \"collect-profiles-29402685-4x5ks\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.334516 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c3c47406-824a-4a88-a637-3a8dd2e86034-secret-volume\") pod \"collect-profiles-29402685-4x5ks\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.356383 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24kkt\" (UniqueName: \"kubernetes.io/projected/c3c47406-824a-4a88-a637-3a8dd2e86034-kube-api-access-24kkt\") pod \"collect-profiles-29402685-4x5ks\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.446536 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:00 crc kubenswrapper[4834]: I1126 12:45:00.814042 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks"] Nov 26 12:45:01 crc kubenswrapper[4834]: I1126 12:45:01.422176 4834 generic.go:334] "Generic (PLEG): container finished" podID="c3c47406-824a-4a88-a637-3a8dd2e86034" containerID="7cd1d537e9e72fc747b339a11811f04971d03a5ade9b1d91de2093b7f1810b4f" exitCode=0 Nov 26 12:45:01 crc kubenswrapper[4834]: I1126 12:45:01.422219 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" event={"ID":"c3c47406-824a-4a88-a637-3a8dd2e86034","Type":"ContainerDied","Data":"7cd1d537e9e72fc747b339a11811f04971d03a5ade9b1d91de2093b7f1810b4f"} Nov 26 12:45:01 crc kubenswrapper[4834]: I1126 12:45:01.422408 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" 
event={"ID":"c3c47406-824a-4a88-a637-3a8dd2e86034","Type":"ContainerStarted","Data":"0e8eec3c9fd4a6b12de9ec3d8f35cf39fef60cf603ae7795782cb9065fbe7574"} Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.665126 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.765403 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24kkt\" (UniqueName: \"kubernetes.io/projected/c3c47406-824a-4a88-a637-3a8dd2e86034-kube-api-access-24kkt\") pod \"c3c47406-824a-4a88-a637-3a8dd2e86034\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.765502 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3c47406-824a-4a88-a637-3a8dd2e86034-config-volume\") pod \"c3c47406-824a-4a88-a637-3a8dd2e86034\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.765532 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3c47406-824a-4a88-a637-3a8dd2e86034-secret-volume\") pod \"c3c47406-824a-4a88-a637-3a8dd2e86034\" (UID: \"c3c47406-824a-4a88-a637-3a8dd2e86034\") " Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.766036 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c47406-824a-4a88-a637-3a8dd2e86034-config-volume" (OuterVolumeSpecName: "config-volume") pod "c3c47406-824a-4a88-a637-3a8dd2e86034" (UID: "c3c47406-824a-4a88-a637-3a8dd2e86034"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.769771 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c47406-824a-4a88-a637-3a8dd2e86034-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c3c47406-824a-4a88-a637-3a8dd2e86034" (UID: "c3c47406-824a-4a88-a637-3a8dd2e86034"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.770126 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c47406-824a-4a88-a637-3a8dd2e86034-kube-api-access-24kkt" (OuterVolumeSpecName: "kube-api-access-24kkt") pod "c3c47406-824a-4a88-a637-3a8dd2e86034" (UID: "c3c47406-824a-4a88-a637-3a8dd2e86034"). InnerVolumeSpecName "kube-api-access-24kkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.867661 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24kkt\" (UniqueName: \"kubernetes.io/projected/c3c47406-824a-4a88-a637-3a8dd2e86034-kube-api-access-24kkt\") on node \"crc\" DevicePath \"\"" Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.867698 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3c47406-824a-4a88-a637-3a8dd2e86034-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 12:45:02 crc kubenswrapper[4834]: I1126 12:45:02.867707 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3c47406-824a-4a88-a637-3a8dd2e86034-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 12:45:03 crc kubenswrapper[4834]: I1126 12:45:03.435381 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" 
event={"ID":"c3c47406-824a-4a88-a637-3a8dd2e86034","Type":"ContainerDied","Data":"0e8eec3c9fd4a6b12de9ec3d8f35cf39fef60cf603ae7795782cb9065fbe7574"} Nov 26 12:45:03 crc kubenswrapper[4834]: I1126 12:45:03.435604 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e8eec3c9fd4a6b12de9ec3d8f35cf39fef60cf603ae7795782cb9065fbe7574" Nov 26 12:45:03 crc kubenswrapper[4834]: I1126 12:45:03.435441 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402685-4x5ks" Nov 26 12:45:03 crc kubenswrapper[4834]: I1126 12:45:03.717326 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz"] Nov 26 12:45:03 crc kubenswrapper[4834]: I1126 12:45:03.722145 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402640-25hcz"] Nov 26 12:45:04 crc kubenswrapper[4834]: I1126 12:45:04.426242 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280cee19-adbf-4307-ac10-337b76f6b6d1" path="/var/lib/kubelet/pods/280cee19-adbf-4307-ac10-337b76f6b6d1/volumes" Nov 26 12:45:13 crc kubenswrapper[4834]: I1126 12:45:13.491362 4834 generic.go:334] "Generic (PLEG): container finished" podID="21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" containerID="53806cfb1c47c466f9db92015b68f5f7229d18714cc5f485c5801da667072e37" exitCode=0 Nov 26 12:45:13 crc kubenswrapper[4834]: I1126 12:45:13.491451 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" event={"ID":"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6","Type":"ContainerDied","Data":"53806cfb1c47c466f9db92015b68f5f7229d18714cc5f485c5801da667072e37"} Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.784042 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.925870 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ceph\") pod \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.925982 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ssh-key\") pod \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.926087 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovn-combined-ca-bundle\") pod \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.926125 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-inventory\") pod \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.926204 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovncontroller-config-0\") pod \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.926233 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wd5b6\" (UniqueName: \"kubernetes.io/projected/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-kube-api-access-wd5b6\") pod \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") " Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.930599 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ceph" (OuterVolumeSpecName: "ceph") pod "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" (UID: "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.930796 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" (UID: "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.931020 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-kube-api-access-wd5b6" (OuterVolumeSpecName: "kube-api-access-wd5b6") pod "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" (UID: "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6"). InnerVolumeSpecName "kube-api-access-wd5b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.944171 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" (UID: "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 12:45:14 crc kubenswrapper[4834]: E1126 12:45:14.944554 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-inventory podName:21eb26c7-797e-4fa5-aa7f-b9acc565e6b6 nodeName:}" failed. No retries permitted until 2025-11-26 12:45:15.444533315 +0000 UTC m=+2013.351746667 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-inventory") pod "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" (UID: "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6") : error deleting /var/lib/kubelet/pods/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6/volume-subpaths: remove /var/lib/kubelet/pods/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6/volume-subpaths: no such file or directory
Nov 26 12:45:14 crc kubenswrapper[4834]: I1126 12:45:14.946514 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" (UID: "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.027760 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.027787 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.027801 4834 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.027812 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd5b6\" (UniqueName: \"kubernetes.io/projected/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-kube-api-access-wd5b6\") on node \"crc\" DevicePath \"\""
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.027820 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-ceph\") on node \"crc\" DevicePath \"\""
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.504540 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4" event={"ID":"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6","Type":"ContainerDied","Data":"a4b0f414cb39b421006488a70fcb7c3b65093caefe45004b48ae74441d1b07b5"}
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.504585 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b0f414cb39b421006488a70fcb7c3b65093caefe45004b48ae74441d1b07b5"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.504592 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m2cd4"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.532637 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-inventory\") pod \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\" (UID: \"21eb26c7-797e-4fa5-aa7f-b9acc565e6b6\") "
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.535406 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-inventory" (OuterVolumeSpecName: "inventory") pod "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" (UID: "21eb26c7-797e-4fa5-aa7f-b9acc565e6b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.563913 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"]
Nov 26 12:45:15 crc kubenswrapper[4834]: E1126 12:45:15.564257 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c47406-824a-4a88-a637-3a8dd2e86034" containerName="collect-profiles"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.564275 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c47406-824a-4a88-a637-3a8dd2e86034" containerName="collect-profiles"
Nov 26 12:45:15 crc kubenswrapper[4834]: E1126 12:45:15.564298 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.564304 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.564464 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c47406-824a-4a88-a637-3a8dd2e86034" containerName="collect-profiles"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.564482 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="21eb26c7-797e-4fa5-aa7f-b9acc565e6b6" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.565022 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.568560 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.568912 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.576835 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"]
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.634482 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.634586 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.634614 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.634642 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.634745 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv4vd\" (UniqueName: \"kubernetes.io/projected/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-kube-api-access-nv4vd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.634797 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.634823 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.634931 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21eb26c7-797e-4fa5-aa7f-b9acc565e6b6-inventory\") on node \"crc\" DevicePath \"\""
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.735885 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.735950 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.736005 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.736027 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.736054 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.736086 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv4vd\" (UniqueName: \"kubernetes.io/projected/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-kube-api-access-nv4vd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.736114 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.738999 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.738998 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.739096 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.739840 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.740053 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.740484 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.751684 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv4vd\" (UniqueName: \"kubernetes.io/projected/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-kube-api-access-nv4vd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:15 crc kubenswrapper[4834]: I1126 12:45:15.882479 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:45:16 crc kubenswrapper[4834]: I1126 12:45:16.291350 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"]
Nov 26 12:45:16 crc kubenswrapper[4834]: I1126 12:45:16.511942 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm" event={"ID":"90cc7502-5ac6-4d61-a70d-d8bc10da96b7","Type":"ContainerStarted","Data":"e5fa905af1ec02f2b30e89b1fae61ad95b7278f2eeac78c1795d44db9025b421"}
Nov 26 12:45:17 crc kubenswrapper[4834]: I1126 12:45:17.520968 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm" event={"ID":"90cc7502-5ac6-4d61-a70d-d8bc10da96b7","Type":"ContainerStarted","Data":"117540b2ce176eddd5e001f5321098ff06fe853eede107f41b0cca1f784cb1e7"}
Nov 26 12:45:17 crc kubenswrapper[4834]: I1126 12:45:17.538936 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm" podStartSLOduration=1.9577106610000001 podStartE2EDuration="2.538923077s" podCreationTimestamp="2025-11-26 12:45:15 +0000 UTC" firstStartedPulling="2025-11-26 12:45:16.294061336 +0000 UTC m=+2014.201274687" lastFinishedPulling="2025-11-26 12:45:16.875273752 +0000 UTC m=+2014.782487103" observedRunningTime="2025-11-26 12:45:17.533011256 +0000 UTC m=+2015.440224607" watchObservedRunningTime="2025-11-26 12:45:17.538923077 +0000 UTC m=+2015.446136430"
Nov 26 12:45:46 crc kubenswrapper[4834]: I1126 12:45:46.243195 4834 scope.go:117] "RemoveContainer" containerID="da409ec7e6526213d87b86559df271455aada2847dec663f781f0a498388c1f6"
Nov 26 12:46:01 crc kubenswrapper[4834]: I1126 12:46:01.850816 4834 generic.go:334] "Generic (PLEG): container finished" podID="90cc7502-5ac6-4d61-a70d-d8bc10da96b7" containerID="117540b2ce176eddd5e001f5321098ff06fe853eede107f41b0cca1f784cb1e7" exitCode=0
Nov 26 12:46:01 crc kubenswrapper[4834]: I1126 12:46:01.850930 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm" event={"ID":"90cc7502-5ac6-4d61-a70d-d8bc10da96b7","Type":"ContainerDied","Data":"117540b2ce176eddd5e001f5321098ff06fe853eede107f41b0cca1f784cb1e7"}
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.170494 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.271416 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv4vd\" (UniqueName: \"kubernetes.io/projected/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-kube-api-access-nv4vd\") pod \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") "
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.271470 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-inventory\") pod \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") "
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.271500 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") "
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.271525 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-nova-metadata-neutron-config-0\") pod \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") "
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.271553 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ssh-key\") pod \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") "
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.271587 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-metadata-combined-ca-bundle\") pod \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") "
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.271619 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ceph\") pod \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\" (UID: \"90cc7502-5ac6-4d61-a70d-d8bc10da96b7\") "
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.278182 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-kube-api-access-nv4vd" (OuterVolumeSpecName: "kube-api-access-nv4vd") pod "90cc7502-5ac6-4d61-a70d-d8bc10da96b7" (UID: "90cc7502-5ac6-4d61-a70d-d8bc10da96b7"). InnerVolumeSpecName "kube-api-access-nv4vd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.279382 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ceph" (OuterVolumeSpecName: "ceph") pod "90cc7502-5ac6-4d61-a70d-d8bc10da96b7" (UID: "90cc7502-5ac6-4d61-a70d-d8bc10da96b7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.279566 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "90cc7502-5ac6-4d61-a70d-d8bc10da96b7" (UID: "90cc7502-5ac6-4d61-a70d-d8bc10da96b7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.297767 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "90cc7502-5ac6-4d61-a70d-d8bc10da96b7" (UID: "90cc7502-5ac6-4d61-a70d-d8bc10da96b7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.300120 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "90cc7502-5ac6-4d61-a70d-d8bc10da96b7" (UID: "90cc7502-5ac6-4d61-a70d-d8bc10da96b7"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.300301 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-inventory" (OuterVolumeSpecName: "inventory") pod "90cc7502-5ac6-4d61-a70d-d8bc10da96b7" (UID: "90cc7502-5ac6-4d61-a70d-d8bc10da96b7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.308997 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "90cc7502-5ac6-4d61-a70d-d8bc10da96b7" (UID: "90cc7502-5ac6-4d61-a70d-d8bc10da96b7"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.373046 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv4vd\" (UniqueName: \"kubernetes.io/projected/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-kube-api-access-nv4vd\") on node \"crc\" DevicePath \"\""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.373095 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-inventory\") on node \"crc\" DevicePath \"\""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.373108 4834 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.373124 4834 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.373138 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.373150 4834 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.373165 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90cc7502-5ac6-4d61-a70d-d8bc10da96b7-ceph\") on node \"crc\" DevicePath \"\""
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.865864 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm" event={"ID":"90cc7502-5ac6-4d61-a70d-d8bc10da96b7","Type":"ContainerDied","Data":"e5fa905af1ec02f2b30e89b1fae61ad95b7278f2eeac78c1795d44db9025b421"}
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.866144 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5fa905af1ec02f2b30e89b1fae61ad95b7278f2eeac78c1795d44db9025b421"
Nov 26 12:46:03 crc kubenswrapper[4834]: I1126 12:46:03.866212 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.039210 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"]
Nov 26 12:46:04 crc kubenswrapper[4834]: E1126 12:46:04.039696 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cc7502-5ac6-4d61-a70d-d8bc10da96b7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.039718 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cc7502-5ac6-4d61-a70d-d8bc10da96b7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.039916 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cc7502-5ac6-4d61-a70d-d8bc10da96b7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.040614 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.042457 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.042733 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.042909 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.043015 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.043127 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.043159 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.053274 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"]
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.089672 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.089721 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.089788 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqr2v\" (UniqueName: \"kubernetes.io/projected/db179b40-80be-4e46-9f35-677180198b4e-kube-api-access-gqr2v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.089859 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.090125 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.090173 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.191169 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.191249 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqr2v\" (UniqueName: \"kubernetes.io/projected/db179b40-80be-4e46-9f35-677180198b4e-kube-api-access-gqr2v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.191301 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.191435 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.191470 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.191493 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.196624 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.196661 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.197022 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.197776 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.198810 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.206993 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqr2v\" (UniqueName: \"kubernetes.io/projected/db179b40-80be-4e46-9f35-677180198b4e-kube-api-access-gqr2v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-77jlt\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.360679 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.816916 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt"]
Nov 26 12:46:04 crc kubenswrapper[4834]: I1126 12:46:04.873076 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt" event={"ID":"db179b40-80be-4e46-9f35-677180198b4e","Type":"ContainerStarted","Data":"906e2501dc4da7d05b8c850bf8e3f2898c5bd986c41b80e2bcb4a14346f47435"}
Nov 26 12:46:05 crc kubenswrapper[4834]: I1126 12:46:05.884371 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt" event={"ID":"db179b40-80be-4e46-9f35-677180198b4e","Type":"ContainerStarted","Data":"2ef73e01537903894c98563e5640a0fb9037f793d9873dd4944274e2c64b0568"}
Nov 26 12:46:05 crc kubenswrapper[4834]: I1126 12:46:05.904165 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt" podStartSLOduration=1.315540269 podStartE2EDuration="1.904150469s" podCreationTimestamp="2025-11-26 12:46:04 +0000 UTC" firstStartedPulling="2025-11-26 12:46:04.816513369 +0000 UTC m=+2062.723726721" lastFinishedPulling="2025-11-26 12:46:05.405123569 +0000 UTC m=+2063.312336921" observedRunningTime="2025-11-26 12:46:05.899357347 +0000 UTC m=+2063.806570699" watchObservedRunningTime="2025-11-26 12:46:05.904150469 +0000 UTC m=+2063.811363820"
Nov 26 12:46:51 crc kubenswrapper[4834]: I1126 12:46:51.531370 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 12:46:51 crc kubenswrapper[4834]: I1126
12:46:51.531981 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:47:21 crc kubenswrapper[4834]: I1126 12:47:21.531834 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:47:21 crc kubenswrapper[4834]: I1126 12:47:21.532197 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:47:51 crc kubenswrapper[4834]: I1126 12:47:51.530902 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:47:51 crc kubenswrapper[4834]: I1126 12:47:51.531252 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:47:51 crc kubenswrapper[4834]: I1126 12:47:51.531288 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:47:51 crc kubenswrapper[4834]: I1126 12:47:51.531741 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:47:51 crc kubenswrapper[4834]: I1126 12:47:51.531793 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" gracePeriod=600 Nov 26 12:47:51 crc kubenswrapper[4834]: E1126 12:47:51.651430 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:47:52 crc kubenswrapper[4834]: I1126 12:47:52.630201 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" exitCode=0 Nov 26 12:47:52 crc kubenswrapper[4834]: I1126 12:47:52.630251 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88"} Nov 26 12:47:52 crc 
kubenswrapper[4834]: I1126 12:47:52.630295 4834 scope.go:117] "RemoveContainer" containerID="91771d6ddc659e4116cd0d5491ab5dd50d9b99c3c9aec05e2aca8bfb28c92527" Nov 26 12:47:52 crc kubenswrapper[4834]: I1126 12:47:52.630879 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:47:52 crc kubenswrapper[4834]: E1126 12:47:52.631096 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:47:53 crc kubenswrapper[4834]: I1126 12:47:53.852889 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4jr9n"] Nov 26 12:47:53 crc kubenswrapper[4834]: I1126 12:47:53.858817 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:53 crc kubenswrapper[4834]: I1126 12:47:53.871701 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jr9n"] Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.055552 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-utilities\") pod \"community-operators-4jr9n\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.056145 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j6n7\" (UniqueName: \"kubernetes.io/projected/c9cd10e1-a122-4bb0-9270-4e958894865a-kube-api-access-9j6n7\") pod \"community-operators-4jr9n\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.056407 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-catalog-content\") pod \"community-operators-4jr9n\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.157931 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-catalog-content\") pod \"community-operators-4jr9n\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.158150 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-utilities\") pod \"community-operators-4jr9n\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.158284 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j6n7\" (UniqueName: \"kubernetes.io/projected/c9cd10e1-a122-4bb0-9270-4e958894865a-kube-api-access-9j6n7\") pod \"community-operators-4jr9n\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.158467 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-catalog-content\") pod \"community-operators-4jr9n\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.158578 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-utilities\") pod \"community-operators-4jr9n\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.188727 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j6n7\" (UniqueName: \"kubernetes.io/projected/c9cd10e1-a122-4bb0-9270-4e958894865a-kube-api-access-9j6n7\") pod \"community-operators-4jr9n\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.475762 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:47:54 crc kubenswrapper[4834]: I1126 12:47:54.846481 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jr9n"] Nov 26 12:47:55 crc kubenswrapper[4834]: I1126 12:47:55.651836 4834 generic.go:334] "Generic (PLEG): container finished" podID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerID="de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5" exitCode=0 Nov 26 12:47:55 crc kubenswrapper[4834]: I1126 12:47:55.651884 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jr9n" event={"ID":"c9cd10e1-a122-4bb0-9270-4e958894865a","Type":"ContainerDied","Data":"de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5"} Nov 26 12:47:55 crc kubenswrapper[4834]: I1126 12:47:55.652051 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jr9n" event={"ID":"c9cd10e1-a122-4bb0-9270-4e958894865a","Type":"ContainerStarted","Data":"bd6960bcf4aaf976d78034ea0f8afcea134c10d58e51ffd2cc2862b4a306482f"} Nov 26 12:47:55 crc kubenswrapper[4834]: I1126 12:47:55.653535 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 12:47:56 crc kubenswrapper[4834]: I1126 12:47:56.659396 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jr9n" event={"ID":"c9cd10e1-a122-4bb0-9270-4e958894865a","Type":"ContainerStarted","Data":"226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29"} Nov 26 12:47:57 crc kubenswrapper[4834]: I1126 12:47:57.666429 4834 generic.go:334] "Generic (PLEG): container finished" podID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerID="226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29" exitCode=0 Nov 26 12:47:57 crc kubenswrapper[4834]: I1126 12:47:57.666468 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-4jr9n" event={"ID":"c9cd10e1-a122-4bb0-9270-4e958894865a","Type":"ContainerDied","Data":"226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29"} Nov 26 12:47:58 crc kubenswrapper[4834]: I1126 12:47:58.675465 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jr9n" event={"ID":"c9cd10e1-a122-4bb0-9270-4e958894865a","Type":"ContainerStarted","Data":"dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7"} Nov 26 12:47:58 crc kubenswrapper[4834]: I1126 12:47:58.688167 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4jr9n" podStartSLOduration=3.14920451 podStartE2EDuration="5.688156781s" podCreationTimestamp="2025-11-26 12:47:53 +0000 UTC" firstStartedPulling="2025-11-26 12:47:55.653324997 +0000 UTC m=+2173.560538349" lastFinishedPulling="2025-11-26 12:47:58.192277268 +0000 UTC m=+2176.099490620" observedRunningTime="2025-11-26 12:47:58.686748664 +0000 UTC m=+2176.593962007" watchObservedRunningTime="2025-11-26 12:47:58.688156781 +0000 UTC m=+2176.595370133" Nov 26 12:48:04 crc kubenswrapper[4834]: I1126 12:48:04.476329 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:48:04 crc kubenswrapper[4834]: I1126 12:48:04.476953 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:48:04 crc kubenswrapper[4834]: I1126 12:48:04.511055 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:48:04 crc kubenswrapper[4834]: I1126 12:48:04.754875 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:48:04 crc kubenswrapper[4834]: I1126 12:48:04.805768 
4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jr9n"] Nov 26 12:48:06 crc kubenswrapper[4834]: I1126 12:48:06.417612 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:48:06 crc kubenswrapper[4834]: E1126 12:48:06.418060 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:48:06 crc kubenswrapper[4834]: I1126 12:48:06.731706 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4jr9n" podUID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerName="registry-server" containerID="cri-o://dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7" gracePeriod=2 Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.100641 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.267887 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-utilities\") pod \"c9cd10e1-a122-4bb0-9270-4e958894865a\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.268014 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j6n7\" (UniqueName: \"kubernetes.io/projected/c9cd10e1-a122-4bb0-9270-4e958894865a-kube-api-access-9j6n7\") pod \"c9cd10e1-a122-4bb0-9270-4e958894865a\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.268104 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-catalog-content\") pod \"c9cd10e1-a122-4bb0-9270-4e958894865a\" (UID: \"c9cd10e1-a122-4bb0-9270-4e958894865a\") " Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.268668 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-utilities" (OuterVolumeSpecName: "utilities") pod "c9cd10e1-a122-4bb0-9270-4e958894865a" (UID: "c9cd10e1-a122-4bb0-9270-4e958894865a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.273471 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cd10e1-a122-4bb0-9270-4e958894865a-kube-api-access-9j6n7" (OuterVolumeSpecName: "kube-api-access-9j6n7") pod "c9cd10e1-a122-4bb0-9270-4e958894865a" (UID: "c9cd10e1-a122-4bb0-9270-4e958894865a"). InnerVolumeSpecName "kube-api-access-9j6n7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.370365 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j6n7\" (UniqueName: \"kubernetes.io/projected/c9cd10e1-a122-4bb0-9270-4e958894865a-kube-api-access-9j6n7\") on node \"crc\" DevicePath \"\"" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.370409 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.448765 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9cd10e1-a122-4bb0-9270-4e958894865a" (UID: "c9cd10e1-a122-4bb0-9270-4e958894865a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.471982 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cd10e1-a122-4bb0-9270-4e958894865a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.739992 4834 generic.go:334] "Generic (PLEG): container finished" podID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerID="dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7" exitCode=0 Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.740038 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jr9n" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.740056 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jr9n" event={"ID":"c9cd10e1-a122-4bb0-9270-4e958894865a","Type":"ContainerDied","Data":"dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7"} Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.740707 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jr9n" event={"ID":"c9cd10e1-a122-4bb0-9270-4e958894865a","Type":"ContainerDied","Data":"bd6960bcf4aaf976d78034ea0f8afcea134c10d58e51ffd2cc2862b4a306482f"} Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.740772 4834 scope.go:117] "RemoveContainer" containerID="dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.757557 4834 scope.go:117] "RemoveContainer" containerID="226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.768116 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jr9n"] Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.773880 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4jr9n"] Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.789388 4834 scope.go:117] "RemoveContainer" containerID="de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.812222 4834 scope.go:117] "RemoveContainer" containerID="dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7" Nov 26 12:48:07 crc kubenswrapper[4834]: E1126 12:48:07.812692 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7\": container with ID starting with dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7 not found: ID does not exist" containerID="dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.812731 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7"} err="failed to get container status \"dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7\": rpc error: code = NotFound desc = could not find container \"dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7\": container with ID starting with dde3472e56a85f0fe8752be5d2d858e3f5c9f730a8849090935d48d1246b82c7 not found: ID does not exist" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.812758 4834 scope.go:117] "RemoveContainer" containerID="226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29" Nov 26 12:48:07 crc kubenswrapper[4834]: E1126 12:48:07.813059 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29\": container with ID starting with 226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29 not found: ID does not exist" containerID="226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.813095 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29"} err="failed to get container status \"226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29\": rpc error: code = NotFound desc = could not find container \"226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29\": container with ID 
starting with 226f61e6933550b3437f089d14eea115603bcd2b714d61ceb3175c764959fc29 not found: ID does not exist" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.813129 4834 scope.go:117] "RemoveContainer" containerID="de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5" Nov 26 12:48:07 crc kubenswrapper[4834]: E1126 12:48:07.813457 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5\": container with ID starting with de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5 not found: ID does not exist" containerID="de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5" Nov 26 12:48:07 crc kubenswrapper[4834]: I1126 12:48:07.813477 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5"} err="failed to get container status \"de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5\": rpc error: code = NotFound desc = could not find container \"de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5\": container with ID starting with de80f4dbdfe65d2eb3aadd8160e34f649106c066e506ddce45eb5aa7a7589af5 not found: ID does not exist" Nov 26 12:48:08 crc kubenswrapper[4834]: I1126 12:48:08.424719 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cd10e1-a122-4bb0-9270-4e958894865a" path="/var/lib/kubelet/pods/c9cd10e1-a122-4bb0-9270-4e958894865a/volumes" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.456142 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-85vd7"] Nov 26 12:48:17 crc kubenswrapper[4834]: E1126 12:48:17.458167 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerName="extract-utilities" Nov 26 12:48:17 crc 
kubenswrapper[4834]: I1126 12:48:17.458239 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerName="extract-utilities" Nov 26 12:48:17 crc kubenswrapper[4834]: E1126 12:48:17.458380 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerName="registry-server" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.458447 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerName="registry-server" Nov 26 12:48:17 crc kubenswrapper[4834]: E1126 12:48:17.458520 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerName="extract-content" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.458579 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerName="extract-content" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.461084 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cd10e1-a122-4bb0-9270-4e958894865a" containerName="registry-server" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.462379 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.466914 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-85vd7"] Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.533424 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhmnb\" (UniqueName: \"kubernetes.io/projected/f3c420f8-c678-43e4-8fc2-74f456b72ae8-kube-api-access-dhmnb\") pod \"redhat-marketplace-85vd7\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.533504 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-catalog-content\") pod \"redhat-marketplace-85vd7\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.533622 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-utilities\") pod \"redhat-marketplace-85vd7\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.635288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhmnb\" (UniqueName: \"kubernetes.io/projected/f3c420f8-c678-43e4-8fc2-74f456b72ae8-kube-api-access-dhmnb\") pod \"redhat-marketplace-85vd7\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.635353 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-catalog-content\") pod \"redhat-marketplace-85vd7\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.635423 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-utilities\") pod \"redhat-marketplace-85vd7\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.635809 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-utilities\") pod \"redhat-marketplace-85vd7\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.635816 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-catalog-content\") pod \"redhat-marketplace-85vd7\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.654782 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhmnb\" (UniqueName: \"kubernetes.io/projected/f3c420f8-c678-43e4-8fc2-74f456b72ae8-kube-api-access-dhmnb\") pod \"redhat-marketplace-85vd7\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:17 crc kubenswrapper[4834]: I1126 12:48:17.795814 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:18 crc kubenswrapper[4834]: I1126 12:48:18.199106 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-85vd7"] Nov 26 12:48:18 crc kubenswrapper[4834]: I1126 12:48:18.818034 4834 generic.go:334] "Generic (PLEG): container finished" podID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerID="698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7" exitCode=0 Nov 26 12:48:18 crc kubenswrapper[4834]: I1126 12:48:18.818145 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85vd7" event={"ID":"f3c420f8-c678-43e4-8fc2-74f456b72ae8","Type":"ContainerDied","Data":"698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7"} Nov 26 12:48:18 crc kubenswrapper[4834]: I1126 12:48:18.818530 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85vd7" event={"ID":"f3c420f8-c678-43e4-8fc2-74f456b72ae8","Type":"ContainerStarted","Data":"1e9b98c17a346a0d6157575c43bb3272eb5ba1c49cec260bfd9dc9ee35f40e89"} Nov 26 12:48:19 crc kubenswrapper[4834]: I1126 12:48:19.417413 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:48:19 crc kubenswrapper[4834]: E1126 12:48:19.417704 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:48:19 crc kubenswrapper[4834]: I1126 12:48:19.827353 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85vd7" 
event={"ID":"f3c420f8-c678-43e4-8fc2-74f456b72ae8","Type":"ContainerStarted","Data":"566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229"} Nov 26 12:48:20 crc kubenswrapper[4834]: I1126 12:48:20.836461 4834 generic.go:334] "Generic (PLEG): container finished" podID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerID="566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229" exitCode=0 Nov 26 12:48:20 crc kubenswrapper[4834]: I1126 12:48:20.836541 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85vd7" event={"ID":"f3c420f8-c678-43e4-8fc2-74f456b72ae8","Type":"ContainerDied","Data":"566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229"} Nov 26 12:48:21 crc kubenswrapper[4834]: I1126 12:48:21.843881 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85vd7" event={"ID":"f3c420f8-c678-43e4-8fc2-74f456b72ae8","Type":"ContainerStarted","Data":"ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c"} Nov 26 12:48:21 crc kubenswrapper[4834]: I1126 12:48:21.860227 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-85vd7" podStartSLOduration=2.734547524 podStartE2EDuration="4.860209098s" podCreationTimestamp="2025-11-26 12:48:17 +0000 UTC" firstStartedPulling="2025-11-26 12:48:18.820193286 +0000 UTC m=+2196.727406639" lastFinishedPulling="2025-11-26 12:48:20.945854861 +0000 UTC m=+2198.853068213" observedRunningTime="2025-11-26 12:48:21.858553094 +0000 UTC m=+2199.765766446" watchObservedRunningTime="2025-11-26 12:48:21.860209098 +0000 UTC m=+2199.767422451" Nov 26 12:48:27 crc kubenswrapper[4834]: I1126 12:48:27.795957 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:27 crc kubenswrapper[4834]: I1126 12:48:27.796432 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:27 crc kubenswrapper[4834]: I1126 12:48:27.829538 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:27 crc kubenswrapper[4834]: I1126 12:48:27.908441 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:28 crc kubenswrapper[4834]: I1126 12:48:28.054122 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-85vd7"] Nov 26 12:48:29 crc kubenswrapper[4834]: I1126 12:48:29.889764 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-85vd7" podUID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerName="registry-server" containerID="cri-o://ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c" gracePeriod=2 Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.234914 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.331671 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-catalog-content\") pod \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.331724 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhmnb\" (UniqueName: \"kubernetes.io/projected/f3c420f8-c678-43e4-8fc2-74f456b72ae8-kube-api-access-dhmnb\") pod \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.331814 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-utilities\") pod \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\" (UID: \"f3c420f8-c678-43e4-8fc2-74f456b72ae8\") " Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.332732 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-utilities" (OuterVolumeSpecName: "utilities") pod "f3c420f8-c678-43e4-8fc2-74f456b72ae8" (UID: "f3c420f8-c678-43e4-8fc2-74f456b72ae8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.335854 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c420f8-c678-43e4-8fc2-74f456b72ae8-kube-api-access-dhmnb" (OuterVolumeSpecName: "kube-api-access-dhmnb") pod "f3c420f8-c678-43e4-8fc2-74f456b72ae8" (UID: "f3c420f8-c678-43e4-8fc2-74f456b72ae8"). InnerVolumeSpecName "kube-api-access-dhmnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.344214 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3c420f8-c678-43e4-8fc2-74f456b72ae8" (UID: "f3c420f8-c678-43e4-8fc2-74f456b72ae8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.432974 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.433000 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c420f8-c678-43e4-8fc2-74f456b72ae8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.433011 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhmnb\" (UniqueName: \"kubernetes.io/projected/f3c420f8-c678-43e4-8fc2-74f456b72ae8-kube-api-access-dhmnb\") on node \"crc\" DevicePath \"\"" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.897237 4834 generic.go:334] "Generic (PLEG): container finished" podID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerID="ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c" exitCode=0 Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.897296 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85vd7" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.897325 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85vd7" event={"ID":"f3c420f8-c678-43e4-8fc2-74f456b72ae8","Type":"ContainerDied","Data":"ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c"} Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.897603 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85vd7" event={"ID":"f3c420f8-c678-43e4-8fc2-74f456b72ae8","Type":"ContainerDied","Data":"1e9b98c17a346a0d6157575c43bb3272eb5ba1c49cec260bfd9dc9ee35f40e89"} Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.897622 4834 scope.go:117] "RemoveContainer" containerID="ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.915860 4834 scope.go:117] "RemoveContainer" containerID="566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.916302 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-85vd7"] Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.922396 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-85vd7"] Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.932073 4834 scope.go:117] "RemoveContainer" containerID="698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.958870 4834 scope.go:117] "RemoveContainer" containerID="ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c" Nov 26 12:48:30 crc kubenswrapper[4834]: E1126 12:48:30.959260 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c\": container with ID starting with ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c not found: ID does not exist" containerID="ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.959395 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c"} err="failed to get container status \"ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c\": rpc error: code = NotFound desc = could not find container \"ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c\": container with ID starting with ebedf720affa1a32bb30963c803715b5ec0e5123358cbc2a7c6885b2b4cdd87c not found: ID does not exist" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.959530 4834 scope.go:117] "RemoveContainer" containerID="566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229" Nov 26 12:48:30 crc kubenswrapper[4834]: E1126 12:48:30.960117 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229\": container with ID starting with 566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229 not found: ID does not exist" containerID="566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.960268 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229"} err="failed to get container status \"566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229\": rpc error: code = NotFound desc = could not find container \"566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229\": container with ID 
starting with 566dbf21e5b857617eb4936fe9fab9ee20adf0086b9b2ab5d04e519fc6c80229 not found: ID does not exist" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.960365 4834 scope.go:117] "RemoveContainer" containerID="698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7" Nov 26 12:48:30 crc kubenswrapper[4834]: E1126 12:48:30.960716 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7\": container with ID starting with 698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7 not found: ID does not exist" containerID="698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7" Nov 26 12:48:30 crc kubenswrapper[4834]: I1126 12:48:30.960805 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7"} err="failed to get container status \"698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7\": rpc error: code = NotFound desc = could not find container \"698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7\": container with ID starting with 698257ff43419db340dfb69e5bea1f4148d51d5f14bcfed394d2b2911a0a67e7 not found: ID does not exist" Nov 26 12:48:32 crc kubenswrapper[4834]: I1126 12:48:32.424985 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" path="/var/lib/kubelet/pods/f3c420f8-c678-43e4-8fc2-74f456b72ae8/volumes" Nov 26 12:48:34 crc kubenswrapper[4834]: I1126 12:48:34.417472 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:48:34 crc kubenswrapper[4834]: E1126 12:48:34.417965 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.575081 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4bwb"] Nov 26 12:48:37 crc kubenswrapper[4834]: E1126 12:48:37.575703 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerName="extract-content" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.575718 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerName="extract-content" Nov 26 12:48:37 crc kubenswrapper[4834]: E1126 12:48:37.575737 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerName="registry-server" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.575743 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerName="registry-server" Nov 26 12:48:37 crc kubenswrapper[4834]: E1126 12:48:37.575765 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerName="extract-utilities" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.575770 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerName="extract-utilities" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.575937 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c420f8-c678-43e4-8fc2-74f456b72ae8" containerName="registry-server" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.577676 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.585605 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4bwb"] Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.736032 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blhgn\" (UniqueName: \"kubernetes.io/projected/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-kube-api-access-blhgn\") pod \"redhat-operators-n4bwb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.736206 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-utilities\") pod \"redhat-operators-n4bwb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.736272 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-catalog-content\") pod \"redhat-operators-n4bwb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.838192 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-utilities\") pod \"redhat-operators-n4bwb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.838279 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-catalog-content\") pod \"redhat-operators-n4bwb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.838341 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blhgn\" (UniqueName: \"kubernetes.io/projected/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-kube-api-access-blhgn\") pod \"redhat-operators-n4bwb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.838703 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-utilities\") pod \"redhat-operators-n4bwb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.839055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-catalog-content\") pod \"redhat-operators-n4bwb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.855786 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blhgn\" (UniqueName: \"kubernetes.io/projected/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-kube-api-access-blhgn\") pod \"redhat-operators-n4bwb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:37 crc kubenswrapper[4834]: I1126 12:48:37.895494 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:38 crc kubenswrapper[4834]: I1126 12:48:38.273686 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4bwb"] Nov 26 12:48:38 crc kubenswrapper[4834]: I1126 12:48:38.949902 4834 generic.go:334] "Generic (PLEG): container finished" podID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerID="ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94" exitCode=0 Nov 26 12:48:38 crc kubenswrapper[4834]: I1126 12:48:38.950330 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4bwb" event={"ID":"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb","Type":"ContainerDied","Data":"ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94"} Nov 26 12:48:38 crc kubenswrapper[4834]: I1126 12:48:38.950366 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4bwb" event={"ID":"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb","Type":"ContainerStarted","Data":"aee1102261c2c93d7da841d8bc02c82dd5304d9ccd1aa6037ba787c0db099ee3"} Nov 26 12:48:39 crc kubenswrapper[4834]: I1126 12:48:39.958430 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4bwb" event={"ID":"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb","Type":"ContainerStarted","Data":"0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60"} Nov 26 12:48:40 crc kubenswrapper[4834]: I1126 12:48:40.966602 4834 generic.go:334] "Generic (PLEG): container finished" podID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerID="0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60" exitCode=0 Nov 26 12:48:40 crc kubenswrapper[4834]: I1126 12:48:40.966751 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4bwb" 
event={"ID":"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb","Type":"ContainerDied","Data":"0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60"} Nov 26 12:48:41 crc kubenswrapper[4834]: I1126 12:48:41.973981 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4bwb" event={"ID":"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb","Type":"ContainerStarted","Data":"4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145"} Nov 26 12:48:41 crc kubenswrapper[4834]: I1126 12:48:41.991833 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4bwb" podStartSLOduration=2.470141795 podStartE2EDuration="4.991816816s" podCreationTimestamp="2025-11-26 12:48:37 +0000 UTC" firstStartedPulling="2025-11-26 12:48:38.951827292 +0000 UTC m=+2216.859040645" lastFinishedPulling="2025-11-26 12:48:41.473502324 +0000 UTC m=+2219.380715666" observedRunningTime="2025-11-26 12:48:41.986814931 +0000 UTC m=+2219.894028283" watchObservedRunningTime="2025-11-26 12:48:41.991816816 +0000 UTC m=+2219.899030168" Nov 26 12:48:45 crc kubenswrapper[4834]: I1126 12:48:45.417542 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:48:45 crc kubenswrapper[4834]: E1126 12:48:45.418609 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:48:47 crc kubenswrapper[4834]: I1126 12:48:47.896246 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:47 crc 
kubenswrapper[4834]: I1126 12:48:47.896921 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:47 crc kubenswrapper[4834]: I1126 12:48:47.932032 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:48 crc kubenswrapper[4834]: I1126 12:48:48.053774 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:48 crc kubenswrapper[4834]: I1126 12:48:48.658664 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4bwb"] Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.033893 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4bwb" podUID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerName="registry-server" containerID="cri-o://4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145" gracePeriod=2 Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.407624 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.451189 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blhgn\" (UniqueName: \"kubernetes.io/projected/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-kube-api-access-blhgn\") pod \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.451291 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-utilities\") pod \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.451742 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-catalog-content\") pod \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\" (UID: \"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb\") " Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.451812 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-utilities" (OuterVolumeSpecName: "utilities") pod "eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" (UID: "eca4ac07-7889-48e3-b6aa-5ec9b86e9deb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.452595 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.457275 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-kube-api-access-blhgn" (OuterVolumeSpecName: "kube-api-access-blhgn") pod "eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" (UID: "eca4ac07-7889-48e3-b6aa-5ec9b86e9deb"). InnerVolumeSpecName "kube-api-access-blhgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.520813 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" (UID: "eca4ac07-7889-48e3-b6aa-5ec9b86e9deb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.554760 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blhgn\" (UniqueName: \"kubernetes.io/projected/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-kube-api-access-blhgn\") on node \"crc\" DevicePath \"\"" Nov 26 12:48:50 crc kubenswrapper[4834]: I1126 12:48:50.554806 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.043005 4834 generic.go:334] "Generic (PLEG): container finished" podID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerID="4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145" exitCode=0 Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.043069 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4bwb" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.043073 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4bwb" event={"ID":"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb","Type":"ContainerDied","Data":"4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145"} Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.043203 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4bwb" event={"ID":"eca4ac07-7889-48e3-b6aa-5ec9b86e9deb","Type":"ContainerDied","Data":"aee1102261c2c93d7da841d8bc02c82dd5304d9ccd1aa6037ba787c0db099ee3"} Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.043243 4834 scope.go:117] "RemoveContainer" containerID="4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.063355 4834 scope.go:117] "RemoveContainer" 
containerID="0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.071423 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4bwb"] Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.078186 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4bwb"] Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.089377 4834 scope.go:117] "RemoveContainer" containerID="ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.113161 4834 scope.go:117] "RemoveContainer" containerID="4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145" Nov 26 12:48:51 crc kubenswrapper[4834]: E1126 12:48:51.113663 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145\": container with ID starting with 4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145 not found: ID does not exist" containerID="4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.113708 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145"} err="failed to get container status \"4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145\": rpc error: code = NotFound desc = could not find container \"4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145\": container with ID starting with 4fdbcba9fce0833bcb917d8c4dc2f0c582ff81c9c294d5e3e220ed578c84f145 not found: ID does not exist" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.113744 4834 scope.go:117] "RemoveContainer" 
containerID="0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60" Nov 26 12:48:51 crc kubenswrapper[4834]: E1126 12:48:51.114106 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60\": container with ID starting with 0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60 not found: ID does not exist" containerID="0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.114127 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60"} err="failed to get container status \"0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60\": rpc error: code = NotFound desc = could not find container \"0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60\": container with ID starting with 0899d6cf66618df2ff1db58ba12128551d86abfcc2bd2ac39b319f506c4cac60 not found: ID does not exist" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.114140 4834 scope.go:117] "RemoveContainer" containerID="ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94" Nov 26 12:48:51 crc kubenswrapper[4834]: E1126 12:48:51.114430 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94\": container with ID starting with ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94 not found: ID does not exist" containerID="ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94" Nov 26 12:48:51 crc kubenswrapper[4834]: I1126 12:48:51.114459 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94"} err="failed to get container status \"ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94\": rpc error: code = NotFound desc = could not find container \"ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94\": container with ID starting with ffdb3ac6f61dcadd6745a2dcc4c3e421fc52af370c8c8a51fbb4e64af1d86c94 not found: ID does not exist" Nov 26 12:48:52 crc kubenswrapper[4834]: I1126 12:48:52.426755 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" path="/var/lib/kubelet/pods/eca4ac07-7889-48e3-b6aa-5ec9b86e9deb/volumes" Nov 26 12:48:56 crc kubenswrapper[4834]: I1126 12:48:56.417653 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:48:56 crc kubenswrapper[4834]: E1126 12:48:56.418544 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:49:11 crc kubenswrapper[4834]: I1126 12:49:11.182048 4834 generic.go:334] "Generic (PLEG): container finished" podID="db179b40-80be-4e46-9f35-677180198b4e" containerID="2ef73e01537903894c98563e5640a0fb9037f793d9873dd4944274e2c64b0568" exitCode=0 Nov 26 12:49:11 crc kubenswrapper[4834]: I1126 12:49:11.182130 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt" event={"ID":"db179b40-80be-4e46-9f35-677180198b4e","Type":"ContainerDied","Data":"2ef73e01537903894c98563e5640a0fb9037f793d9873dd4944274e2c64b0568"} Nov 26 12:49:11 crc 
kubenswrapper[4834]: I1126 12:49:11.417170 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:49:11 crc kubenswrapper[4834]: E1126 12:49:11.417515 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.534115 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.624357 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-combined-ca-bundle\") pod \"db179b40-80be-4e46-9f35-677180198b4e\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.624415 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-secret-0\") pod \"db179b40-80be-4e46-9f35-677180198b4e\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.624451 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqr2v\" (UniqueName: \"kubernetes.io/projected/db179b40-80be-4e46-9f35-677180198b4e-kube-api-access-gqr2v\") pod \"db179b40-80be-4e46-9f35-677180198b4e\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " Nov 26 12:49:12 crc 
kubenswrapper[4834]: I1126 12:49:12.624514 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-inventory\") pod \"db179b40-80be-4e46-9f35-677180198b4e\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.624552 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ssh-key\") pod \"db179b40-80be-4e46-9f35-677180198b4e\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.624643 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ceph\") pod \"db179b40-80be-4e46-9f35-677180198b4e\" (UID: \"db179b40-80be-4e46-9f35-677180198b4e\") " Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.630191 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "db179b40-80be-4e46-9f35-677180198b4e" (UID: "db179b40-80be-4e46-9f35-677180198b4e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.630691 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ceph" (OuterVolumeSpecName: "ceph") pod "db179b40-80be-4e46-9f35-677180198b4e" (UID: "db179b40-80be-4e46-9f35-677180198b4e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.631539 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db179b40-80be-4e46-9f35-677180198b4e-kube-api-access-gqr2v" (OuterVolumeSpecName: "kube-api-access-gqr2v") pod "db179b40-80be-4e46-9f35-677180198b4e" (UID: "db179b40-80be-4e46-9f35-677180198b4e"). InnerVolumeSpecName "kube-api-access-gqr2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.650491 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-inventory" (OuterVolumeSpecName: "inventory") pod "db179b40-80be-4e46-9f35-677180198b4e" (UID: "db179b40-80be-4e46-9f35-677180198b4e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.654929 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "db179b40-80be-4e46-9f35-677180198b4e" (UID: "db179b40-80be-4e46-9f35-677180198b4e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.655555 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "db179b40-80be-4e46-9f35-677180198b4e" (UID: "db179b40-80be-4e46-9f35-677180198b4e"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.726941 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.726988 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.726999 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.727009 4834 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.727023 4834 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db179b40-80be-4e46-9f35-677180198b4e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:49:12 crc kubenswrapper[4834]: I1126 12:49:12.727047 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqr2v\" (UniqueName: \"kubernetes.io/projected/db179b40-80be-4e46-9f35-677180198b4e-kube-api-access-gqr2v\") on node \"crc\" DevicePath \"\"" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.199194 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt" event={"ID":"db179b40-80be-4e46-9f35-677180198b4e","Type":"ContainerDied","Data":"906e2501dc4da7d05b8c850bf8e3f2898c5bd986c41b80e2bcb4a14346f47435"} Nov 26 12:49:13 crc 
kubenswrapper[4834]: I1126 12:49:13.199683 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="906e2501dc4da7d05b8c850bf8e3f2898c5bd986c41b80e2bcb4a14346f47435" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.199742 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-77jlt" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.287146 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz"] Nov 26 12:49:13 crc kubenswrapper[4834]: E1126 12:49:13.287712 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerName="extract-utilities" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.287737 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerName="extract-utilities" Nov 26 12:49:13 crc kubenswrapper[4834]: E1126 12:49:13.287759 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerName="registry-server" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.287767 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerName="registry-server" Nov 26 12:49:13 crc kubenswrapper[4834]: E1126 12:49:13.287793 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db179b40-80be-4e46-9f35-677180198b4e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.287800 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="db179b40-80be-4e46-9f35-677180198b4e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 26 12:49:13 crc kubenswrapper[4834]: E1126 12:49:13.287820 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerName="extract-content" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.287826 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerName="extract-content" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.288001 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca4ac07-7889-48e3-b6aa-5ec9b86e9deb" containerName="registry-server" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.288048 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="db179b40-80be-4e46-9f35-677180198b4e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.288816 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.291727 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.291978 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.292113 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.292405 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.292531 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.292686 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ps9tl" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 
12:49:13.292812 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.292930 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.293057 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.294487 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz"] Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.347536 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.347653 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.347805 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" 
(UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.347901 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.348128 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.348236 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.348278 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc 
kubenswrapper[4834]: I1126 12:49:13.348330 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbzvg\" (UniqueName: \"kubernetes.io/projected/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-kube-api-access-tbzvg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.348416 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.348492 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.348609 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.449889 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.449939 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.450023 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.450089 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.450115 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-inventory\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.450262 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbzvg\" (UniqueName: \"kubernetes.io/projected/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-kube-api-access-tbzvg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.450294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.450333 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.451148 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.451709 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.451798 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.451851 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.452428 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.454952 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.455194 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.455261 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.455351 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.455953 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.456003 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.456841 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.457044 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.465679 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbzvg\" (UniqueName: \"kubernetes.io/projected/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-kube-api-access-tbzvg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:13 crc kubenswrapper[4834]: I1126 12:49:13.604112 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:49:14 crc kubenswrapper[4834]: I1126 12:49:14.028231 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz"] Nov 26 12:49:14 crc kubenswrapper[4834]: I1126 12:49:14.205817 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" event={"ID":"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17","Type":"ContainerStarted","Data":"5ea8edec06afabbef10ac24ab60bcc237f8af8f2cbe4f5c3439309b9b1de90ae"} Nov 26 12:49:15 crc kubenswrapper[4834]: I1126 12:49:15.213576 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" event={"ID":"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17","Type":"ContainerStarted","Data":"dcb48e176bde9e74d6d37333af95ee5676bae3201349124a0152214fb33d8fb1"} Nov 26 12:49:15 crc kubenswrapper[4834]: I1126 12:49:15.230272 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" podStartSLOduration=1.622454651 podStartE2EDuration="2.230260922s" podCreationTimestamp="2025-11-26 12:49:13 +0000 UTC" firstStartedPulling="2025-11-26 12:49:14.047367922 +0000 UTC m=+2251.954581273" lastFinishedPulling="2025-11-26 12:49:14.655174191 +0000 UTC m=+2252.562387544" observedRunningTime="2025-11-26 12:49:15.22930594 +0000 UTC m=+2253.136519292" watchObservedRunningTime="2025-11-26 12:49:15.230260922 +0000 UTC m=+2253.137474273" Nov 26 12:49:22 crc kubenswrapper[4834]: I1126 12:49:22.421608 4834 scope.go:117] "RemoveContainer" 
containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:49:22 crc kubenswrapper[4834]: E1126 12:49:22.423388 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.018468 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7hjzd"] Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.020660 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.029240 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7hjzd"] Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.064277 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-catalog-content\") pod \"certified-operators-7hjzd\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.064329 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-utilities\") pod \"certified-operators-7hjzd\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.064456 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g64d5\" (UniqueName: \"kubernetes.io/projected/f1dbdcac-b4d9-49bd-a980-03d1570e2399-kube-api-access-g64d5\") pod \"certified-operators-7hjzd\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.166477 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g64d5\" (UniqueName: \"kubernetes.io/projected/f1dbdcac-b4d9-49bd-a980-03d1570e2399-kube-api-access-g64d5\") pod \"certified-operators-7hjzd\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.166595 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-catalog-content\") pod \"certified-operators-7hjzd\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.166653 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-utilities\") pod \"certified-operators-7hjzd\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.167187 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-catalog-content\") pod \"certified-operators-7hjzd\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 
12:49:26.167213 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-utilities\") pod \"certified-operators-7hjzd\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.185747 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g64d5\" (UniqueName: \"kubernetes.io/projected/f1dbdcac-b4d9-49bd-a980-03d1570e2399-kube-api-access-g64d5\") pod \"certified-operators-7hjzd\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.338423 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:26 crc kubenswrapper[4834]: I1126 12:49:26.815824 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7hjzd"] Nov 26 12:49:26 crc kubenswrapper[4834]: W1126 12:49:26.820079 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1dbdcac_b4d9_49bd_a980_03d1570e2399.slice/crio-2e85f77afc1be48e28c935b1c778ef665aa06f2415ac69bdfb58ada95e4cd0d0 WatchSource:0}: Error finding container 2e85f77afc1be48e28c935b1c778ef665aa06f2415ac69bdfb58ada95e4cd0d0: Status 404 returned error can't find the container with id 2e85f77afc1be48e28c935b1c778ef665aa06f2415ac69bdfb58ada95e4cd0d0 Nov 26 12:49:27 crc kubenswrapper[4834]: I1126 12:49:27.309151 4834 generic.go:334] "Generic (PLEG): container finished" podID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerID="e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc" exitCode=0 Nov 26 12:49:27 crc kubenswrapper[4834]: I1126 12:49:27.309195 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-7hjzd" event={"ID":"f1dbdcac-b4d9-49bd-a980-03d1570e2399","Type":"ContainerDied","Data":"e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc"} Nov 26 12:49:27 crc kubenswrapper[4834]: I1126 12:49:27.309232 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hjzd" event={"ID":"f1dbdcac-b4d9-49bd-a980-03d1570e2399","Type":"ContainerStarted","Data":"2e85f77afc1be48e28c935b1c778ef665aa06f2415ac69bdfb58ada95e4cd0d0"} Nov 26 12:49:29 crc kubenswrapper[4834]: I1126 12:49:29.327010 4834 generic.go:334] "Generic (PLEG): container finished" podID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerID="90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9" exitCode=0 Nov 26 12:49:29 crc kubenswrapper[4834]: I1126 12:49:29.327080 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hjzd" event={"ID":"f1dbdcac-b4d9-49bd-a980-03d1570e2399","Type":"ContainerDied","Data":"90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9"} Nov 26 12:49:30 crc kubenswrapper[4834]: I1126 12:49:30.339225 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hjzd" event={"ID":"f1dbdcac-b4d9-49bd-a980-03d1570e2399","Type":"ContainerStarted","Data":"09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f"} Nov 26 12:49:30 crc kubenswrapper[4834]: I1126 12:49:30.363826 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7hjzd" podStartSLOduration=1.842664785 podStartE2EDuration="4.363799518s" podCreationTimestamp="2025-11-26 12:49:26 +0000 UTC" firstStartedPulling="2025-11-26 12:49:27.311035329 +0000 UTC m=+2265.218248680" lastFinishedPulling="2025-11-26 12:49:29.832170061 +0000 UTC m=+2267.739383413" observedRunningTime="2025-11-26 12:49:30.357379969 +0000 UTC m=+2268.264593321" 
watchObservedRunningTime="2025-11-26 12:49:30.363799518 +0000 UTC m=+2268.271012870" Nov 26 12:49:36 crc kubenswrapper[4834]: I1126 12:49:36.338639 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:36 crc kubenswrapper[4834]: I1126 12:49:36.339146 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:36 crc kubenswrapper[4834]: I1126 12:49:36.372984 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:36 crc kubenswrapper[4834]: I1126 12:49:36.411563 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:36 crc kubenswrapper[4834]: I1126 12:49:36.605514 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7hjzd"] Nov 26 12:49:37 crc kubenswrapper[4834]: I1126 12:49:37.417329 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:49:37 crc kubenswrapper[4834]: E1126 12:49:37.417559 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:49:38 crc kubenswrapper[4834]: I1126 12:49:38.391729 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7hjzd" podUID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerName="registry-server" 
containerID="cri-o://09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f" gracePeriod=2 Nov 26 12:49:38 crc kubenswrapper[4834]: I1126 12:49:38.747190 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:38 crc kubenswrapper[4834]: I1126 12:49:38.899669 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g64d5\" (UniqueName: \"kubernetes.io/projected/f1dbdcac-b4d9-49bd-a980-03d1570e2399-kube-api-access-g64d5\") pod \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " Nov 26 12:49:38 crc kubenswrapper[4834]: I1126 12:49:38.900076 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-utilities\") pod \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " Nov 26 12:49:38 crc kubenswrapper[4834]: I1126 12:49:38.900336 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-catalog-content\") pod \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\" (UID: \"f1dbdcac-b4d9-49bd-a980-03d1570e2399\") " Nov 26 12:49:38 crc kubenswrapper[4834]: I1126 12:49:38.900741 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-utilities" (OuterVolumeSpecName: "utilities") pod "f1dbdcac-b4d9-49bd-a980-03d1570e2399" (UID: "f1dbdcac-b4d9-49bd-a980-03d1570e2399"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:49:38 crc kubenswrapper[4834]: I1126 12:49:38.901030 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:49:38 crc kubenswrapper[4834]: I1126 12:49:38.905409 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1dbdcac-b4d9-49bd-a980-03d1570e2399-kube-api-access-g64d5" (OuterVolumeSpecName: "kube-api-access-g64d5") pod "f1dbdcac-b4d9-49bd-a980-03d1570e2399" (UID: "f1dbdcac-b4d9-49bd-a980-03d1570e2399"). InnerVolumeSpecName "kube-api-access-g64d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.004016 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g64d5\" (UniqueName: \"kubernetes.io/projected/f1dbdcac-b4d9-49bd-a980-03d1570e2399-kube-api-access-g64d5\") on node \"crc\" DevicePath \"\"" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.110444 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1dbdcac-b4d9-49bd-a980-03d1570e2399" (UID: "f1dbdcac-b4d9-49bd-a980-03d1570e2399"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.209290 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1dbdcac-b4d9-49bd-a980-03d1570e2399-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.398457 4834 generic.go:334] "Generic (PLEG): container finished" podID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerID="09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f" exitCode=0 Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.398488 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hjzd" event={"ID":"f1dbdcac-b4d9-49bd-a980-03d1570e2399","Type":"ContainerDied","Data":"09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f"} Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.398520 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7hjzd" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.398540 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hjzd" event={"ID":"f1dbdcac-b4d9-49bd-a980-03d1570e2399","Type":"ContainerDied","Data":"2e85f77afc1be48e28c935b1c778ef665aa06f2415ac69bdfb58ada95e4cd0d0"} Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.398561 4834 scope.go:117] "RemoveContainer" containerID="09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.419283 4834 scope.go:117] "RemoveContainer" containerID="90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.432284 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7hjzd"] Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.439521 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7hjzd"] Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.451532 4834 scope.go:117] "RemoveContainer" containerID="e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.472822 4834 scope.go:117] "RemoveContainer" containerID="09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f" Nov 26 12:49:39 crc kubenswrapper[4834]: E1126 12:49:39.473225 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f\": container with ID starting with 09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f not found: ID does not exist" containerID="09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.473339 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f"} err="failed to get container status \"09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f\": rpc error: code = NotFound desc = could not find container \"09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f\": container with ID starting with 09aa7e6ea0193e9b072cf2d21b545550080b60ecfc0b339209ca1da832242c7f not found: ID does not exist" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.473443 4834 scope.go:117] "RemoveContainer" containerID="90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9" Nov 26 12:49:39 crc kubenswrapper[4834]: E1126 12:49:39.473852 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9\": container with ID starting with 90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9 not found: ID does not exist" containerID="90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.473876 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9"} err="failed to get container status \"90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9\": rpc error: code = NotFound desc = could not find container \"90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9\": container with ID starting with 90a30dc6a29550da9051767dd7b5ad872335324f46f6784060fb94ad3230c5a9 not found: ID does not exist" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.473890 4834 scope.go:117] "RemoveContainer" containerID="e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc" Nov 26 12:49:39 crc kubenswrapper[4834]: E1126 
12:49:39.474161 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc\": container with ID starting with e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc not found: ID does not exist" containerID="e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc" Nov 26 12:49:39 crc kubenswrapper[4834]: I1126 12:49:39.474230 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc"} err="failed to get container status \"e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc\": rpc error: code = NotFound desc = could not find container \"e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc\": container with ID starting with e3d54f1bd9416f5bd7bccb28fddad018de1312715ee8648106f749f9772870dc not found: ID does not exist" Nov 26 12:49:40 crc kubenswrapper[4834]: I1126 12:49:40.425637 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" path="/var/lib/kubelet/pods/f1dbdcac-b4d9-49bd-a980-03d1570e2399/volumes" Nov 26 12:49:52 crc kubenswrapper[4834]: I1126 12:49:52.421927 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:49:52 crc kubenswrapper[4834]: E1126 12:49:52.422821 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:50:04 crc kubenswrapper[4834]: I1126 12:50:04.418849 
4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:50:04 crc kubenswrapper[4834]: E1126 12:50:04.419458 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:50:17 crc kubenswrapper[4834]: I1126 12:50:17.416689 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:50:17 crc kubenswrapper[4834]: E1126 12:50:17.417542 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:50:32 crc kubenswrapper[4834]: I1126 12:50:32.421045 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:50:32 crc kubenswrapper[4834]: E1126 12:50:32.421603 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:50:47 crc kubenswrapper[4834]: I1126 
12:50:47.417083 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:50:47 crc kubenswrapper[4834]: E1126 12:50:47.417799 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:50:59 crc kubenswrapper[4834]: I1126 12:50:59.417601 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:50:59 crc kubenswrapper[4834]: E1126 12:50:59.418248 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:51:13 crc kubenswrapper[4834]: I1126 12:51:13.416859 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:51:13 crc kubenswrapper[4834]: E1126 12:51:13.417377 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:51:23 crc 
kubenswrapper[4834]: I1126 12:51:23.050729 4834 generic.go:334] "Generic (PLEG): container finished" podID="c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" containerID="dcb48e176bde9e74d6d37333af95ee5676bae3201349124a0152214fb33d8fb1" exitCode=0 Nov 26 12:51:23 crc kubenswrapper[4834]: I1126 12:51:23.050787 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" event={"ID":"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17","Type":"ContainerDied","Data":"dcb48e176bde9e74d6d37333af95ee5676bae3201349124a0152214fb33d8fb1"} Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.363153 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476370 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-1\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476423 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-inventory\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476475 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-1\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476523 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph-nova-0\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476574 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476600 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ssh-key\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476685 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-custom-ceph-combined-ca-bundle\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476717 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-0\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476740 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-extra-config-0\") pod 
\"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476763 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-0\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.476796 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbzvg\" (UniqueName: \"kubernetes.io/projected/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-kube-api-access-tbzvg\") pod \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\" (UID: \"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17\") " Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.484665 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.485625 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph" (OuterVolumeSpecName: "ceph") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.486012 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-kube-api-access-tbzvg" (OuterVolumeSpecName: "kube-api-access-tbzvg") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "kube-api-access-tbzvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.497835 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-inventory" (OuterVolumeSpecName: "inventory") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.498789 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.499369 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.500349 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.500755 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.500892 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.501057 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.501999 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" (UID: "c3dd49ba-b6b6-4dd2-be83-7c04977d2b17"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579102 4834 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579292 4834 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579370 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579424 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579484 4834 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579545 4834 reconciler_common.go:293] "Volume detached for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579596 4834 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579644 4834 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579697 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbzvg\" (UniqueName: \"kubernetes.io/projected/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-kube-api-access-tbzvg\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579751 4834 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:24 crc kubenswrapper[4834]: I1126 12:51:24.579805 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3dd49ba-b6b6-4dd2-be83-7c04977d2b17-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:25 crc kubenswrapper[4834]: I1126 12:51:25.064985 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" event={"ID":"c3dd49ba-b6b6-4dd2-be83-7c04977d2b17","Type":"ContainerDied","Data":"5ea8edec06afabbef10ac24ab60bcc237f8af8f2cbe4f5c3439309b9b1de90ae"} Nov 26 12:51:25 crc kubenswrapper[4834]: I1126 12:51:25.065331 4834 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="5ea8edec06afabbef10ac24ab60bcc237f8af8f2cbe4f5c3439309b9b1de90ae" Nov 26 12:51:25 crc kubenswrapper[4834]: I1126 12:51:25.065038 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz" Nov 26 12:51:27 crc kubenswrapper[4834]: I1126 12:51:27.416680 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:51:27 crc kubenswrapper[4834]: E1126 12:51:27.417764 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.932942 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Nov 26 12:51:35 crc kubenswrapper[4834]: E1126 12:51:35.933635 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.933709 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 26 12:51:35 crc kubenswrapper[4834]: E1126 12:51:35.933722 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerName="extract-content" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.933727 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" 
containerName="extract-content" Nov 26 12:51:35 crc kubenswrapper[4834]: E1126 12:51:35.933743 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerName="extract-utilities" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.933749 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerName="extract-utilities" Nov 26 12:51:35 crc kubenswrapper[4834]: E1126 12:51:35.933759 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerName="registry-server" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.933764 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerName="registry-server" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.933914 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1dbdcac-b4d9-49bd-a980-03d1570e2399" containerName="registry-server" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.933939 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3dd49ba-b6b6-4dd2-be83-7c04977d2b17" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.934741 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.936417 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.939564 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.940618 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.952353 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.953182 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.955127 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 26 12:51:35 crc kubenswrapper[4834]: I1126 12:51:35.961284 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.041875 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.041921 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.041941 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.041960 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkxbd\" (UniqueName: \"kubernetes.io/projected/e978c710-b8cc-4608-a8b7-32619386447c-kube-api-access-zkxbd\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.041978 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042098 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-lib-modules\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042149 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042177 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042200 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042220 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-dev\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042363 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042393 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042419 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-config-data\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042468 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-sys\") pod 
\"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042573 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042621 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042660 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042687 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042714 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042745 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs5ds\" (UniqueName: \"kubernetes.io/projected/475f5c3e-098e-4a99-83da-c2513b5d0ed7-kube-api-access-qs5ds\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042824 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042915 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-run\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042935 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042958 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e978c710-b8cc-4608-a8b7-32619386447c-ceph\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc 
kubenswrapper[4834]: I1126 12:51:36.042981 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.042996 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.043045 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.043085 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/475f5c3e-098e-4a99-83da-c2513b5d0ed7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.043105 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-run\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.043138 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.043163 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.043188 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-scripts\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144766 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-run\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144801 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144825 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e978c710-b8cc-4608-a8b7-32619386447c-ceph\") pod 
\"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144845 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144861 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144877 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144897 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-run\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144906 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/475f5c3e-098e-4a99-83da-c2513b5d0ed7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144933 
4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144967 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-run\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.144993 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145028 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145012 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145047 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-run\") pod \"cinder-backup-0\" (UID: 
\"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145066 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145102 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-scripts\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145123 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145158 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145172 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145195 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkxbd\" (UniqueName: \"kubernetes.io/projected/e978c710-b8cc-4608-a8b7-32619386447c-kube-api-access-zkxbd\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145208 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145241 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-lib-modules\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145279 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145300 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145341 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145362 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-dev\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145400 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145415 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145436 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-config-data\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145456 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-sys\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc 
kubenswrapper[4834]: I1126 12:51:36.145531 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145561 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-lib-modules\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145583 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-dev\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145602 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145991 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145994 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146011 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146026 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146048 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-sys\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.145535 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146062 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc 
kubenswrapper[4834]: I1126 12:51:36.146074 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146095 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146119 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146136 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146168 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs5ds\" (UniqueName: \"kubernetes.io/projected/475f5c3e-098e-4a99-83da-c2513b5d0ed7-kube-api-access-qs5ds\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146197 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146226 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146235 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146324 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/475f5c3e-098e-4a99-83da-c2513b5d0ed7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146395 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.146544 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e978c710-b8cc-4608-a8b7-32619386447c-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 
crc kubenswrapper[4834]: I1126 12:51:36.151179 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-config-data\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.151542 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.151750 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-scripts\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.151985 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.152210 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.152423 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e978c710-b8cc-4608-a8b7-32619386447c-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.154751 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/475f5c3e-098e-4a99-83da-c2513b5d0ed7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.159331 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.160456 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475f5c3e-098e-4a99-83da-c2513b5d0ed7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.161214 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkxbd\" (UniqueName: \"kubernetes.io/projected/e978c710-b8cc-4608-a8b7-32619386447c-kube-api-access-zkxbd\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.161913 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs5ds\" (UniqueName: \"kubernetes.io/projected/475f5c3e-098e-4a99-83da-c2513b5d0ed7-kube-api-access-qs5ds\") pod \"cinder-volume-volume1-0\" (UID: \"475f5c3e-098e-4a99-83da-c2513b5d0ed7\") " 
pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.170700 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e978c710-b8cc-4608-a8b7-32619386447c-ceph\") pod \"cinder-backup-0\" (UID: \"e978c710-b8cc-4608-a8b7-32619386447c\") " pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.257465 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.264840 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.461182 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-b2qmr"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.462788 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.469969 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-b2qmr"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.537841 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-246d-account-create-update-9jk9p"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.538928 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.541239 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.543961 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-246d-account-create-update-9jk9p"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.665600 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llhbr\" (UniqueName: \"kubernetes.io/projected/69f2a300-39e4-4bfc-bb9a-5646fe44709c-kube-api-access-llhbr\") pod \"manila-db-create-b2qmr\" (UID: \"69f2a300-39e4-4bfc-bb9a-5646fe44709c\") " pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.665660 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915f4671-4ace-4684-b095-75ee89fc9c7b-operator-scripts\") pod \"manila-246d-account-create-update-9jk9p\" (UID: \"915f4671-4ace-4684-b095-75ee89fc9c7b\") " pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.665735 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f2a300-39e4-4bfc-bb9a-5646fe44709c-operator-scripts\") pod \"manila-db-create-b2qmr\" (UID: \"69f2a300-39e4-4bfc-bb9a-5646fe44709c\") " pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.665761 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x84f\" (UniqueName: \"kubernetes.io/projected/915f4671-4ace-4684-b095-75ee89fc9c7b-kube-api-access-8x84f\") pod \"manila-246d-account-create-update-9jk9p\" 
(UID: \"915f4671-4ace-4684-b095-75ee89fc9c7b\") " pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.741326 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.769534 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llhbr\" (UniqueName: \"kubernetes.io/projected/69f2a300-39e4-4bfc-bb9a-5646fe44709c-kube-api-access-llhbr\") pod \"manila-db-create-b2qmr\" (UID: \"69f2a300-39e4-4bfc-bb9a-5646fe44709c\") " pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.769594 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915f4671-4ace-4684-b095-75ee89fc9c7b-operator-scripts\") pod \"manila-246d-account-create-update-9jk9p\" (UID: \"915f4671-4ace-4684-b095-75ee89fc9c7b\") " pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.769639 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f2a300-39e4-4bfc-bb9a-5646fe44709c-operator-scripts\") pod \"manila-db-create-b2qmr\" (UID: \"69f2a300-39e4-4bfc-bb9a-5646fe44709c\") " pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.769664 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x84f\" (UniqueName: \"kubernetes.io/projected/915f4671-4ace-4684-b095-75ee89fc9c7b-kube-api-access-8x84f\") pod \"manila-246d-account-create-update-9jk9p\" (UID: \"915f4671-4ace-4684-b095-75ee89fc9c7b\") " pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.770364 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f2a300-39e4-4bfc-bb9a-5646fe44709c-operator-scripts\") pod \"manila-db-create-b2qmr\" (UID: \"69f2a300-39e4-4bfc-bb9a-5646fe44709c\") " pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.770395 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915f4671-4ace-4684-b095-75ee89fc9c7b-operator-scripts\") pod \"manila-246d-account-create-update-9jk9p\" (UID: \"915f4671-4ace-4684-b095-75ee89fc9c7b\") " pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.778440 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.780821 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.783274 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.783537 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.783634 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.783716 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9wfxh" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.785285 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llhbr\" (UniqueName: \"kubernetes.io/projected/69f2a300-39e4-4bfc-bb9a-5646fe44709c-kube-api-access-llhbr\") pod \"manila-db-create-b2qmr\" (UID: 
\"69f2a300-39e4-4bfc-bb9a-5646fe44709c\") " pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.785949 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x84f\" (UniqueName: \"kubernetes.io/projected/915f4671-4ace-4684-b095-75ee89fc9c7b-kube-api-access-8x84f\") pod \"manila-246d-account-create-update-9jk9p\" (UID: \"915f4671-4ace-4684-b095-75ee89fc9c7b\") " pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.794235 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.815359 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.817912 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.819599 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.819635 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.825397 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.837867 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.852421 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.979897 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/659dc5b8-db0c-47cd-9c94-ba96b4256129-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.979957 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm68v\" (UniqueName: \"kubernetes.io/projected/fdfe0239-1020-40b2-9031-02cd04267ad0-kube-api-access-pm68v\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980194 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659dc5b8-db0c-47cd-9c94-ba96b4256129-logs\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980213 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdfe0239-1020-40b2-9031-02cd04267ad0-logs\") pod \"glance-default-external-api-0\" 
(UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980236 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980257 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktz4m\" (UniqueName: \"kubernetes.io/projected/659dc5b8-db0c-47cd-9c94-ba96b4256129-kube-api-access-ktz4m\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980284 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980448 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-scripts\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980490 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980550 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-config-data\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980579 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980656 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980767 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/659dc5b8-db0c-47cd-9c94-ba96b4256129-ceph\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980841 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980895 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fdfe0239-1020-40b2-9031-02cd04267ad0-ceph\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980936 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdfe0239-1020-40b2-9031-02cd04267ad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:36 crc kubenswrapper[4834]: I1126 12:51:36.980961 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.079063 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083057 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083149 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/659dc5b8-db0c-47cd-9c94-ba96b4256129-ceph\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083204 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083237 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fdfe0239-1020-40b2-9031-02cd04267ad0-ceph\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083286 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdfe0239-1020-40b2-9031-02cd04267ad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083304 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083365 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/659dc5b8-db0c-47cd-9c94-ba96b4256129-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083385 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm68v\" (UniqueName: \"kubernetes.io/projected/fdfe0239-1020-40b2-9031-02cd04267ad0-kube-api-access-pm68v\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083437 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083470 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659dc5b8-db0c-47cd-9c94-ba96b4256129-logs\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083505 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdfe0239-1020-40b2-9031-02cd04267ad0-logs\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083520 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083535 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktz4m\" (UniqueName: \"kubernetes.io/projected/659dc5b8-db0c-47cd-9c94-ba96b4256129-kube-api-access-ktz4m\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083550 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083594 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-scripts\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083611 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083636 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-config-data\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083671 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.084105 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.084398 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659dc5b8-db0c-47cd-9c94-ba96b4256129-logs\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.084668 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdfe0239-1020-40b2-9031-02cd04267ad0-logs\") pod 
\"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.083382 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.085130 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/659dc5b8-db0c-47cd-9c94-ba96b4256129-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.089818 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/659dc5b8-db0c-47cd-9c94-ba96b4256129-ceph\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.089873 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fdfe0239-1020-40b2-9031-02cd04267ad0-ceph\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.090550 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdfe0239-1020-40b2-9031-02cd04267ad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " 
pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.098249 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.098876 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.099554 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.102033 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-config-data\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.102145 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.102199 
4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdfe0239-1020-40b2-9031-02cd04267ad0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.102648 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.107302 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktz4m\" (UniqueName: \"kubernetes.io/projected/659dc5b8-db0c-47cd-9c94-ba96b4256129-kube-api-access-ktz4m\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.108618 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm68v\" (UniqueName: \"kubernetes.io/projected/fdfe0239-1020-40b2-9031-02cd04267ad0-kube-api-access-pm68v\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.110416 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/659dc5b8-db0c-47cd-9c94-ba96b4256129-scripts\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.120712 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"659dc5b8-db0c-47cd-9c94-ba96b4256129\") " pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.122716 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fdfe0239-1020-40b2-9031-02cd04267ad0\") " pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.137657 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"475f5c3e-098e-4a99-83da-c2513b5d0ed7","Type":"ContainerStarted","Data":"173de4c57427320008b596dd13f8e50e3d1cb449aa05f7074042ff74c7884321"} Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.138595 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.138780 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e978c710-b8cc-4608-a8b7-32619386447c","Type":"ContainerStarted","Data":"35dc01efaf43b4685a6a0eef81d3c8507c5c8ffe789abad3c0fc604b0b051da6"} Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.247946 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-246d-account-create-update-9jk9p"] Nov 26 12:51:37 crc kubenswrapper[4834]: W1126 12:51:37.254637 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod915f4671_4ace_4684_b095_75ee89fc9c7b.slice/crio-c28c661d4e1f998b517508fb9ecef2ebd3fa626e2811fdca9e2f54b9d46cf61d WatchSource:0}: Error finding container c28c661d4e1f998b517508fb9ecef2ebd3fa626e2811fdca9e2f54b9d46cf61d: Status 404 returned error can't find the container with id c28c661d4e1f998b517508fb9ecef2ebd3fa626e2811fdca9e2f54b9d46cf61d Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.397609 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.513765 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-b2qmr"] Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.552686 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 12:51:37 crc kubenswrapper[4834]: I1126 12:51:37.990293 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.150471 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"475f5c3e-098e-4a99-83da-c2513b5d0ed7","Type":"ContainerStarted","Data":"9b620ce18b47087116904fa5db5fcd84c8b03078a49bfb9d0816792c4031894c"} Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.154348 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"659dc5b8-db0c-47cd-9c94-ba96b4256129","Type":"ContainerStarted","Data":"af6ec2fe599ac8f5915250fbe6884533c7ab079f99ccacaf2e13b17ded807117"} Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.154380 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"659dc5b8-db0c-47cd-9c94-ba96b4256129","Type":"ContainerStarted","Data":"c1ec10380440669f135d45ce3940c97ef6bb0cfeb2962953a87fcb11d295cadc"} Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.157521 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e978c710-b8cc-4608-a8b7-32619386447c","Type":"ContainerStarted","Data":"67cb2b66fad6bf946b6d525138a63757b0906a117b054402abbf0e155a8c7f92"} Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.158946 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"fdfe0239-1020-40b2-9031-02cd04267ad0","Type":"ContainerStarted","Data":"92fd8146ae73722862879d2efb9cb7fb04ead15357a36cd5164517c9e9c21ba2"} Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.161146 4834 generic.go:334] "Generic (PLEG): container finished" podID="915f4671-4ace-4684-b095-75ee89fc9c7b" containerID="852f8b0ee55db909f7916448f36100bacb1e2af50b181a0ac9283d65b0a254ab" exitCode=0 Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.161215 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-246d-account-create-update-9jk9p" event={"ID":"915f4671-4ace-4684-b095-75ee89fc9c7b","Type":"ContainerDied","Data":"852f8b0ee55db909f7916448f36100bacb1e2af50b181a0ac9283d65b0a254ab"} Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.161243 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-246d-account-create-update-9jk9p" event={"ID":"915f4671-4ace-4684-b095-75ee89fc9c7b","Type":"ContainerStarted","Data":"c28c661d4e1f998b517508fb9ecef2ebd3fa626e2811fdca9e2f54b9d46cf61d"} Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.163143 4834 generic.go:334] "Generic (PLEG): container finished" podID="69f2a300-39e4-4bfc-bb9a-5646fe44709c" containerID="b5bfd1eadc4d4ac6a9959fbcc6c60e8fee14a95279a5d8a53268539a77c817c0" exitCode=0 Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.163184 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-b2qmr" event={"ID":"69f2a300-39e4-4bfc-bb9a-5646fe44709c","Type":"ContainerDied","Data":"b5bfd1eadc4d4ac6a9959fbcc6c60e8fee14a95279a5d8a53268539a77c817c0"} Nov 26 12:51:38 crc kubenswrapper[4834]: I1126 12:51:38.163226 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-b2qmr" event={"ID":"69f2a300-39e4-4bfc-bb9a-5646fe44709c","Type":"ContainerStarted","Data":"4c1e2f01f9e2affdde5fe0367e6e4ef613e60c5286784d966fe5a4a200ea3b31"} Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.172511 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"475f5c3e-098e-4a99-83da-c2513b5d0ed7","Type":"ContainerStarted","Data":"cff9d41d8699a781a31a3e155739bc404c6a3f960833e13f5067b6b551691c19"} Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.175396 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"659dc5b8-db0c-47cd-9c94-ba96b4256129","Type":"ContainerStarted","Data":"3d8776d3ee11f2598da26b966c908786788bc7a5a3bfbf55acaad0658f494266"} Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.177398 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e978c710-b8cc-4608-a8b7-32619386447c","Type":"ContainerStarted","Data":"d8ca5bd00d04fd6b0332f8ee33bbdb3399fba68cfbac664638649c7958142bd5"} Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.179812 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdfe0239-1020-40b2-9031-02cd04267ad0","Type":"ContainerStarted","Data":"4d917d4b0a6f185e13060be069f0d6655ca69ae0434c80541debe861dc190898"} Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.179854 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdfe0239-1020-40b2-9031-02cd04267ad0","Type":"ContainerStarted","Data":"46c9053409f5ce8da6236c9d02b29569f8a7aa3e80ba517070703339952089be"} Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.202952 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.077001054 podStartE2EDuration="4.202936486s" podCreationTimestamp="2025-11-26 12:51:35 +0000 UTC" firstStartedPulling="2025-11-26 12:51:36.840656798 +0000 UTC m=+2394.747870150" lastFinishedPulling="2025-11-26 12:51:37.96659223 +0000 UTC m=+2395.873805582" observedRunningTime="2025-11-26 12:51:39.202696202 
+0000 UTC m=+2397.109909554" watchObservedRunningTime="2025-11-26 12:51:39.202936486 +0000 UTC m=+2397.110149838" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.224578 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.224565602 podStartE2EDuration="4.224565602s" podCreationTimestamp="2025-11-26 12:51:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:51:39.223760724 +0000 UTC m=+2397.130974076" watchObservedRunningTime="2025-11-26 12:51:39.224565602 +0000 UTC m=+2397.131778954" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.268163 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.268146504 podStartE2EDuration="4.268146504s" podCreationTimestamp="2025-11-26 12:51:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:51:39.264630352 +0000 UTC m=+2397.171843703" watchObservedRunningTime="2025-11-26 12:51:39.268146504 +0000 UTC m=+2397.175359855" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.271119 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.1798856620000002 podStartE2EDuration="4.271111396s" podCreationTimestamp="2025-11-26 12:51:35 +0000 UTC" firstStartedPulling="2025-11-26 12:51:36.743530909 +0000 UTC m=+2394.650744261" lastFinishedPulling="2025-11-26 12:51:37.834756642 +0000 UTC m=+2395.741969995" observedRunningTime="2025-11-26 12:51:39.249957816 +0000 UTC m=+2397.157171169" watchObservedRunningTime="2025-11-26 12:51:39.271111396 +0000 UTC m=+2397.178324748" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.418128 4834 scope.go:117] "RemoveContainer" 
containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:51:39 crc kubenswrapper[4834]: E1126 12:51:39.418998 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.582480 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.586981 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.737799 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x84f\" (UniqueName: \"kubernetes.io/projected/915f4671-4ace-4684-b095-75ee89fc9c7b-kube-api-access-8x84f\") pod \"915f4671-4ace-4684-b095-75ee89fc9c7b\" (UID: \"915f4671-4ace-4684-b095-75ee89fc9c7b\") " Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.737968 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llhbr\" (UniqueName: \"kubernetes.io/projected/69f2a300-39e4-4bfc-bb9a-5646fe44709c-kube-api-access-llhbr\") pod \"69f2a300-39e4-4bfc-bb9a-5646fe44709c\" (UID: \"69f2a300-39e4-4bfc-bb9a-5646fe44709c\") " Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.738143 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915f4671-4ace-4684-b095-75ee89fc9c7b-operator-scripts\") pod 
\"915f4671-4ace-4684-b095-75ee89fc9c7b\" (UID: \"915f4671-4ace-4684-b095-75ee89fc9c7b\") " Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.738168 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f2a300-39e4-4bfc-bb9a-5646fe44709c-operator-scripts\") pod \"69f2a300-39e4-4bfc-bb9a-5646fe44709c\" (UID: \"69f2a300-39e4-4bfc-bb9a-5646fe44709c\") " Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.738670 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/915f4671-4ace-4684-b095-75ee89fc9c7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "915f4671-4ace-4684-b095-75ee89fc9c7b" (UID: "915f4671-4ace-4684-b095-75ee89fc9c7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.738672 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f2a300-39e4-4bfc-bb9a-5646fe44709c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69f2a300-39e4-4bfc-bb9a-5646fe44709c" (UID: "69f2a300-39e4-4bfc-bb9a-5646fe44709c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.743930 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915f4671-4ace-4684-b095-75ee89fc9c7b-kube-api-access-8x84f" (OuterVolumeSpecName: "kube-api-access-8x84f") pod "915f4671-4ace-4684-b095-75ee89fc9c7b" (UID: "915f4671-4ace-4684-b095-75ee89fc9c7b"). InnerVolumeSpecName "kube-api-access-8x84f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.743959 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f2a300-39e4-4bfc-bb9a-5646fe44709c-kube-api-access-llhbr" (OuterVolumeSpecName: "kube-api-access-llhbr") pod "69f2a300-39e4-4bfc-bb9a-5646fe44709c" (UID: "69f2a300-39e4-4bfc-bb9a-5646fe44709c"). InnerVolumeSpecName "kube-api-access-llhbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.840993 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915f4671-4ace-4684-b095-75ee89fc9c7b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.841199 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f2a300-39e4-4bfc-bb9a-5646fe44709c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.841208 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x84f\" (UniqueName: \"kubernetes.io/projected/915f4671-4ace-4684-b095-75ee89fc9c7b-kube-api-access-8x84f\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:39 crc kubenswrapper[4834]: I1126 12:51:39.841219 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llhbr\" (UniqueName: \"kubernetes.io/projected/69f2a300-39e4-4bfc-bb9a-5646fe44709c-kube-api-access-llhbr\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:40 crc kubenswrapper[4834]: I1126 12:51:40.188418 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-246d-account-create-update-9jk9p" event={"ID":"915f4671-4ace-4684-b095-75ee89fc9c7b","Type":"ContainerDied","Data":"c28c661d4e1f998b517508fb9ecef2ebd3fa626e2811fdca9e2f54b9d46cf61d"} Nov 26 12:51:40 crc kubenswrapper[4834]: I1126 
12:51:40.188445 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-246d-account-create-update-9jk9p" Nov 26 12:51:40 crc kubenswrapper[4834]: I1126 12:51:40.188462 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c28c661d4e1f998b517508fb9ecef2ebd3fa626e2811fdca9e2f54b9d46cf61d" Nov 26 12:51:40 crc kubenswrapper[4834]: I1126 12:51:40.189764 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-b2qmr" event={"ID":"69f2a300-39e4-4bfc-bb9a-5646fe44709c","Type":"ContainerDied","Data":"4c1e2f01f9e2affdde5fe0367e6e4ef613e60c5286784d966fe5a4a200ea3b31"} Nov 26 12:51:40 crc kubenswrapper[4834]: I1126 12:51:40.189835 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c1e2f01f9e2affdde5fe0367e6e4ef613e60c5286784d966fe5a4a200ea3b31" Nov 26 12:51:40 crc kubenswrapper[4834]: I1126 12:51:40.189776 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-b2qmr" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.258179 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.265876 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.868803 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-4bj4f"] Nov 26 12:51:41 crc kubenswrapper[4834]: E1126 12:51:41.869241 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="915f4671-4ace-4684-b095-75ee89fc9c7b" containerName="mariadb-account-create-update" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.869278 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="915f4671-4ace-4684-b095-75ee89fc9c7b" containerName="mariadb-account-create-update" Nov 26 12:51:41 crc kubenswrapper[4834]: E1126 12:51:41.869303 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f2a300-39e4-4bfc-bb9a-5646fe44709c" containerName="mariadb-database-create" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.869339 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f2a300-39e4-4bfc-bb9a-5646fe44709c" containerName="mariadb-database-create" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.869540 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="915f4671-4ace-4684-b095-75ee89fc9c7b" containerName="mariadb-account-create-update" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.869561 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f2a300-39e4-4bfc-bb9a-5646fe44709c" containerName="mariadb-database-create" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.870236 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.871906 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-cxnh5" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.872125 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.878168 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-4bj4f"] Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.981306 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-job-config-data\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.981582 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-combined-ca-bundle\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.981635 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-config-data\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:41 crc kubenswrapper[4834]: I1126 12:51:41.981661 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfpxr\" (UniqueName: 
\"kubernetes.io/projected/b67b910b-d864-4d0f-9f34-921e1cdd0517-kube-api-access-xfpxr\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.083148 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-job-config-data\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.083216 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-combined-ca-bundle\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.083255 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-config-data\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.083294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfpxr\" (UniqueName: \"kubernetes.io/projected/b67b910b-d864-4d0f-9f34-921e1cdd0517-kube-api-access-xfpxr\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.089712 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-job-config-data\") pod \"manila-db-sync-4bj4f\" (UID: 
\"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.091405 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-config-data\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.091901 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-combined-ca-bundle\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.097004 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfpxr\" (UniqueName: \"kubernetes.io/projected/b67b910b-d864-4d0f-9f34-921e1cdd0517-kube-api-access-xfpxr\") pod \"manila-db-sync-4bj4f\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.186438 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:42 crc kubenswrapper[4834]: W1126 12:51:42.672474 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb67b910b_d864_4d0f_9f34_921e1cdd0517.slice/crio-c61b35ae656770096d2012a7c66263f1db976d01a42fd3f4eb3da68b6a001017 WatchSource:0}: Error finding container c61b35ae656770096d2012a7c66263f1db976d01a42fd3f4eb3da68b6a001017: Status 404 returned error can't find the container with id c61b35ae656770096d2012a7c66263f1db976d01a42fd3f4eb3da68b6a001017 Nov 26 12:51:42 crc kubenswrapper[4834]: I1126 12:51:42.672816 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-4bj4f"] Nov 26 12:51:43 crc kubenswrapper[4834]: I1126 12:51:43.219767 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-4bj4f" event={"ID":"b67b910b-d864-4d0f-9f34-921e1cdd0517","Type":"ContainerStarted","Data":"c61b35ae656770096d2012a7c66263f1db976d01a42fd3f4eb3da68b6a001017"} Nov 26 12:51:46 crc kubenswrapper[4834]: I1126 12:51:46.455956 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Nov 26 12:51:46 crc kubenswrapper[4834]: I1126 12:51:46.512716 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Nov 26 12:51:47 crc kubenswrapper[4834]: I1126 12:51:47.138940 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:47 crc kubenswrapper[4834]: I1126 12:51:47.139193 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:47 crc kubenswrapper[4834]: I1126 12:51:47.167951 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:47 crc 
kubenswrapper[4834]: I1126 12:51:47.178709 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:47 crc kubenswrapper[4834]: I1126 12:51:47.251477 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:47 crc kubenswrapper[4834]: I1126 12:51:47.251604 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:47 crc kubenswrapper[4834]: I1126 12:51:47.399047 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 12:51:47 crc kubenswrapper[4834]: I1126 12:51:47.399394 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 12:51:47 crc kubenswrapper[4834]: I1126 12:51:47.485854 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 12:51:47 crc kubenswrapper[4834]: I1126 12:51:47.494768 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 12:51:48 crc kubenswrapper[4834]: I1126 12:51:48.261156 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-4bj4f" event={"ID":"b67b910b-d864-4d0f-9f34-921e1cdd0517","Type":"ContainerStarted","Data":"fe22f91d0777809dad4c61f6974fe149d36bf20729102bdb45d24af0085ebf53"} Nov 26 12:51:48 crc kubenswrapper[4834]: I1126 12:51:48.262054 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 26 12:51:48 crc kubenswrapper[4834]: I1126 12:51:48.262129 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 26 12:51:48 crc kubenswrapper[4834]: I1126 12:51:48.285679 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-4bj4f" podStartSLOduration=2.92726791 podStartE2EDuration="7.285662869s" podCreationTimestamp="2025-11-26 12:51:41 +0000 UTC" firstStartedPulling="2025-11-26 12:51:42.674632588 +0000 UTC m=+2400.581845940" lastFinishedPulling="2025-11-26 12:51:47.033027556 +0000 UTC m=+2404.940240899" observedRunningTime="2025-11-26 12:51:48.278789234 +0000 UTC m=+2406.186002587" watchObservedRunningTime="2025-11-26 12:51:48.285662869 +0000 UTC m=+2406.192876221" Nov 26 12:51:49 crc kubenswrapper[4834]: I1126 12:51:49.321423 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:49 crc kubenswrapper[4834]: I1126 12:51:49.321848 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 12:51:49 crc kubenswrapper[4834]: I1126 12:51:49.323079 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 12:51:49 crc kubenswrapper[4834]: I1126 12:51:49.984722 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 12:51:50 crc kubenswrapper[4834]: I1126 12:51:50.016306 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 12:51:52 crc kubenswrapper[4834]: I1126 12:51:52.292097 4834 generic.go:334] "Generic (PLEG): container finished" podID="b67b910b-d864-4d0f-9f34-921e1cdd0517" containerID="fe22f91d0777809dad4c61f6974fe149d36bf20729102bdb45d24af0085ebf53" exitCode=0 Nov 26 12:51:52 crc kubenswrapper[4834]: I1126 12:51:52.292183 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-4bj4f" event={"ID":"b67b910b-d864-4d0f-9f34-921e1cdd0517","Type":"ContainerDied","Data":"fe22f91d0777809dad4c61f6974fe149d36bf20729102bdb45d24af0085ebf53"} Nov 26 12:51:53 
crc kubenswrapper[4834]: I1126 12:51:53.632064 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.647626 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfpxr\" (UniqueName: \"kubernetes.io/projected/b67b910b-d864-4d0f-9f34-921e1cdd0517-kube-api-access-xfpxr\") pod \"b67b910b-d864-4d0f-9f34-921e1cdd0517\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.647727 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-config-data\") pod \"b67b910b-d864-4d0f-9f34-921e1cdd0517\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.647803 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-combined-ca-bundle\") pod \"b67b910b-d864-4d0f-9f34-921e1cdd0517\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.647893 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-job-config-data\") pod \"b67b910b-d864-4d0f-9f34-921e1cdd0517\" (UID: \"b67b910b-d864-4d0f-9f34-921e1cdd0517\") " Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.654410 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b67b910b-d864-4d0f-9f34-921e1cdd0517-kube-api-access-xfpxr" (OuterVolumeSpecName: "kube-api-access-xfpxr") pod "b67b910b-d864-4d0f-9f34-921e1cdd0517" (UID: "b67b910b-d864-4d0f-9f34-921e1cdd0517"). InnerVolumeSpecName "kube-api-access-xfpxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.654961 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "b67b910b-d864-4d0f-9f34-921e1cdd0517" (UID: "b67b910b-d864-4d0f-9f34-921e1cdd0517"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.656346 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-config-data" (OuterVolumeSpecName: "config-data") pod "b67b910b-d864-4d0f-9f34-921e1cdd0517" (UID: "b67b910b-d864-4d0f-9f34-921e1cdd0517"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.675736 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b67b910b-d864-4d0f-9f34-921e1cdd0517" (UID: "b67b910b-d864-4d0f-9f34-921e1cdd0517"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.749281 4834 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-job-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.749406 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfpxr\" (UniqueName: \"kubernetes.io/projected/b67b910b-d864-4d0f-9f34-921e1cdd0517-kube-api-access-xfpxr\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.749472 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:53 crc kubenswrapper[4834]: I1126 12:51:53.749525 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67b910b-d864-4d0f-9f34-921e1cdd0517-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.315405 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-4bj4f" event={"ID":"b67b910b-d864-4d0f-9f34-921e1cdd0517","Type":"ContainerDied","Data":"c61b35ae656770096d2012a7c66263f1db976d01a42fd3f4eb3da68b6a001017"} Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.315734 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c61b35ae656770096d2012a7c66263f1db976d01a42fd3f4eb3da68b6a001017" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.315530 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-4bj4f" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.417412 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:51:54 crc kubenswrapper[4834]: E1126 12:51:54.417715 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.632725 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 12:51:54 crc kubenswrapper[4834]: E1126 12:51:54.633287 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67b910b-d864-4d0f-9f34-921e1cdd0517" containerName="manila-db-sync" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.633322 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67b910b-d864-4d0f-9f34-921e1cdd0517" containerName="manila-db-sync" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.633591 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b67b910b-d864-4d0f-9f34-921e1cdd0517" containerName="manila-db-sync" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.639036 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.643363 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-cxnh5" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.643609 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.643772 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.643824 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.644163 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.664546 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.665008 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.665112 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfbbb\" (UniqueName: \"kubernetes.io/projected/7ec328cd-60df-4d12-96be-5ed62485724b-kube-api-access-dfbbb\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.665197 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec328cd-60df-4d12-96be-5ed62485724b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.665279 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.665683 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.665787 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-scripts\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.666022 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.669085 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.678534 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.713261 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78f48d6b7c-vglzt"] Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.727593 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.765611 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78f48d6b7c-vglzt"] Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768323 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-ovsdbserver-sb\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768438 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768481 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-combined-ca-bundle\") pod 
\"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768516 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-ceph\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768543 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768612 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768634 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-scripts\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768681 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfssb\" (UniqueName: \"kubernetes.io/projected/94a582ef-1398-4fa2-afa2-2627ebc94e06-kube-api-access-lfssb\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " 
pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768710 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768747 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768783 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbbb\" (UniqueName: \"kubernetes.io/projected/7ec328cd-60df-4d12-96be-5ed62485724b-kube-api-access-dfbbb\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768805 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-scripts\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768844 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqzpx\" (UniqueName: \"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-kube-api-access-zqzpx\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 
crc kubenswrapper[4834]: I1126 12:51:54.768863 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec328cd-60df-4d12-96be-5ed62485724b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768889 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768913 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.768938 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-config\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.769001 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-ovsdbserver-nb\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.769083 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-dns-svc\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.769108 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-openstack-edpm-ipam\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.775646 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-scripts\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.775917 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.776305 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec328cd-60df-4d12-96be-5ed62485724b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.780975 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.787522 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.794989 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfbbb\" (UniqueName: \"kubernetes.io/projected/7ec328cd-60df-4d12-96be-5ed62485724b-kube-api-access-dfbbb\") pod \"manila-scheduler-0\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.857540 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.859710 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.862068 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.865467 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.871740 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.871825 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfssb\" (UniqueName: \"kubernetes.io/projected/94a582ef-1398-4fa2-afa2-2627ebc94e06-kube-api-access-lfssb\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872083 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872161 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872201 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-scripts\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872240 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqzpx\" (UniqueName: \"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-kube-api-access-zqzpx\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872277 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872321 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-config\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872385 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-ovsdbserver-nb\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872453 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-dns-svc\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872476 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-openstack-edpm-ipam\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872542 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-ovsdbserver-sb\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872601 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872637 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-ceph\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.872660 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data\") pod 
\"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.874104 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-ovsdbserver-nb\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.874937 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-ovsdbserver-sb\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.874940 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.875553 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-openstack-edpm-ipam\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.875650 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-dns-svc\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " 
pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.875771 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a582ef-1398-4fa2-afa2-2627ebc94e06-config\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.876084 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.877403 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-scripts\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.884158 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-ceph\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.886470 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.891931 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.898010 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfssb\" (UniqueName: \"kubernetes.io/projected/94a582ef-1398-4fa2-afa2-2627ebc94e06-kube-api-access-lfssb\") pod \"dnsmasq-dns-78f48d6b7c-vglzt\" (UID: \"94a582ef-1398-4fa2-afa2-2627ebc94e06\") " pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.898520 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqzpx\" (UniqueName: \"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-kube-api-access-zqzpx\") pod \"manila-share-share1-0\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " pod="openstack/manila-share-share1-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.961821 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.974205 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data-custom\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.974410 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6835f243-4f76-4e56-a99a-77ba34fbde14-logs\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.974494 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7gls\" (UniqueName: \"kubernetes.io/projected/6835f243-4f76-4e56-a99a-77ba34fbde14-kube-api-access-s7gls\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.974610 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.974737 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.974867 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6835f243-4f76-4e56-a99a-77ba34fbde14-etc-machine-id\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.974901 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-scripts\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:54 crc kubenswrapper[4834]: I1126 12:51:54.987731 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.052665 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.076813 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6835f243-4f76-4e56-a99a-77ba34fbde14-logs\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.076866 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gls\" (UniqueName: \"kubernetes.io/projected/6835f243-4f76-4e56-a99a-77ba34fbde14-kube-api-access-s7gls\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.076923 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.076955 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.076996 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6835f243-4f76-4e56-a99a-77ba34fbde14-etc-machine-id\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.077015 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-scripts\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.077039 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data-custom\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.077507 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6835f243-4f76-4e56-a99a-77ba34fbde14-logs\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.079560 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6835f243-4f76-4e56-a99a-77ba34fbde14-etc-machine-id\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.081893 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data-custom\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.083011 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-scripts\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.086106 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.086565 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.108200 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gls\" (UniqueName: \"kubernetes.io/projected/6835f243-4f76-4e56-a99a-77ba34fbde14-kube-api-access-s7gls\") pod \"manila-api-0\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " 
pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.261741 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.439620 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.586953 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 26 12:51:55 crc kubenswrapper[4834]: W1126 12:51:55.590427 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6835f243_4f76_4e56_a99a_77ba34fbde14.slice/crio-ab51f148cb75d881824a432c8fa92604a4c9ee6a7d695a564987ee4cbaab0e4c WatchSource:0}: Error finding container ab51f148cb75d881824a432c8fa92604a4c9ee6a7d695a564987ee4cbaab0e4c: Status 404 returned error can't find the container with id ab51f148cb75d881824a432c8fa92604a4c9ee6a7d695a564987ee4cbaab0e4c Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.621382 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78f48d6b7c-vglzt"] Nov 26 12:51:55 crc kubenswrapper[4834]: W1126 12:51:55.626619 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94a582ef_1398_4fa2_afa2_2627ebc94e06.slice/crio-2253e032193f387b4bbc102502a420979fc4cf5f3e8a471766aa1d40d34e1905 WatchSource:0}: Error finding container 2253e032193f387b4bbc102502a420979fc4cf5f3e8a471766aa1d40d34e1905: Status 404 returned error can't find the container with id 2253e032193f387b4bbc102502a420979fc4cf5f3e8a471766aa1d40d34e1905 Nov 26 12:51:55 crc kubenswrapper[4834]: I1126 12:51:55.775395 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 12:51:56 crc kubenswrapper[4834]: I1126 12:51:56.343384 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7ec328cd-60df-4d12-96be-5ed62485724b","Type":"ContainerStarted","Data":"97d83aac19ffb24e76465a8d144e24200544a635193f1aa1d93f7ec040f4712f"} Nov 26 12:51:56 crc kubenswrapper[4834]: I1126 12:51:56.345507 4834 generic.go:334] "Generic (PLEG): container finished" podID="94a582ef-1398-4fa2-afa2-2627ebc94e06" containerID="270d223bc68d75a321124080c06f5c591561eb394701f6b5687a3a1e6e2859bc" exitCode=0 Nov 26 12:51:56 crc kubenswrapper[4834]: I1126 12:51:56.345577 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" event={"ID":"94a582ef-1398-4fa2-afa2-2627ebc94e06","Type":"ContainerDied","Data":"270d223bc68d75a321124080c06f5c591561eb394701f6b5687a3a1e6e2859bc"} Nov 26 12:51:56 crc kubenswrapper[4834]: I1126 12:51:56.345603 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" event={"ID":"94a582ef-1398-4fa2-afa2-2627ebc94e06","Type":"ContainerStarted","Data":"2253e032193f387b4bbc102502a420979fc4cf5f3e8a471766aa1d40d34e1905"} Nov 26 12:51:56 crc kubenswrapper[4834]: I1126 12:51:56.346546 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9e538d8f-eaf6-4f97-a773-1d965586a83d","Type":"ContainerStarted","Data":"67f78632d7fb7330f59f449a8c4b7debbd39e5940ad4cc9b02b03ac8320d5ec7"} Nov 26 12:51:56 crc kubenswrapper[4834]: I1126 12:51:56.350653 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6835f243-4f76-4e56-a99a-77ba34fbde14","Type":"ContainerStarted","Data":"0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347"} Nov 26 12:51:56 crc kubenswrapper[4834]: I1126 12:51:56.350682 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6835f243-4f76-4e56-a99a-77ba34fbde14","Type":"ContainerStarted","Data":"ab51f148cb75d881824a432c8fa92604a4c9ee6a7d695a564987ee4cbaab0e4c"} 
Nov 26 12:51:57 crc kubenswrapper[4834]: I1126 12:51:57.329647 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Nov 26 12:51:57 crc kubenswrapper[4834]: I1126 12:51:57.362011 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6835f243-4f76-4e56-a99a-77ba34fbde14","Type":"ContainerStarted","Data":"29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b"} Nov 26 12:51:57 crc kubenswrapper[4834]: I1126 12:51:57.362131 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 26 12:51:57 crc kubenswrapper[4834]: I1126 12:51:57.364206 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7ec328cd-60df-4d12-96be-5ed62485724b","Type":"ContainerStarted","Data":"e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef"} Nov 26 12:51:57 crc kubenswrapper[4834]: I1126 12:51:57.364249 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7ec328cd-60df-4d12-96be-5ed62485724b","Type":"ContainerStarted","Data":"37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20"} Nov 26 12:51:57 crc kubenswrapper[4834]: I1126 12:51:57.366160 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" event={"ID":"94a582ef-1398-4fa2-afa2-2627ebc94e06","Type":"ContainerStarted","Data":"0a81cf30ebc364e50233aedeb59b4d10063a47371e5e5029455e232457163519"} Nov 26 12:51:57 crc kubenswrapper[4834]: I1126 12:51:57.418626 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.418604179 podStartE2EDuration="3.418604179s" podCreationTimestamp="2025-11-26 12:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:51:57.375823525 +0000 UTC 
m=+2415.283036878" watchObservedRunningTime="2025-11-26 12:51:57.418604179 +0000 UTC m=+2415.325817531" Nov 26 12:51:57 crc kubenswrapper[4834]: I1126 12:51:57.448652 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.432616384 podStartE2EDuration="3.448627268s" podCreationTimestamp="2025-11-26 12:51:54 +0000 UTC" firstStartedPulling="2025-11-26 12:51:55.449804029 +0000 UTC m=+2413.357017381" lastFinishedPulling="2025-11-26 12:51:56.465814912 +0000 UTC m=+2414.373028265" observedRunningTime="2025-11-26 12:51:57.409973379 +0000 UTC m=+2415.317186731" watchObservedRunningTime="2025-11-26 12:51:57.448627268 +0000 UTC m=+2415.355840620" Nov 26 12:51:57 crc kubenswrapper[4834]: I1126 12:51:57.488473 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" podStartSLOduration=3.48845469 podStartE2EDuration="3.48845469s" podCreationTimestamp="2025-11-26 12:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:51:57.436811962 +0000 UTC m=+2415.344025314" watchObservedRunningTime="2025-11-26 12:51:57.48845469 +0000 UTC m=+2415.395668042" Nov 26 12:51:58 crc kubenswrapper[4834]: I1126 12:51:58.379471 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerName="manila-api-log" containerID="cri-o://0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347" gracePeriod=30 Nov 26 12:51:58 crc kubenswrapper[4834]: I1126 12:51:58.379961 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerName="manila-api" containerID="cri-o://29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b" gracePeriod=30 Nov 26 12:51:58 crc 
kubenswrapper[4834]: I1126 12:51:58.379495 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.008751 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.098400 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-scripts\") pod \"6835f243-4f76-4e56-a99a-77ba34fbde14\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.098461 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data\") pod \"6835f243-4f76-4e56-a99a-77ba34fbde14\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.098502 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7gls\" (UniqueName: \"kubernetes.io/projected/6835f243-4f76-4e56-a99a-77ba34fbde14-kube-api-access-s7gls\") pod \"6835f243-4f76-4e56-a99a-77ba34fbde14\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.098521 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6835f243-4f76-4e56-a99a-77ba34fbde14-logs\") pod \"6835f243-4f76-4e56-a99a-77ba34fbde14\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.098537 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6835f243-4f76-4e56-a99a-77ba34fbde14-etc-machine-id\") pod 
\"6835f243-4f76-4e56-a99a-77ba34fbde14\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.098559 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-combined-ca-bundle\") pod \"6835f243-4f76-4e56-a99a-77ba34fbde14\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.098602 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data-custom\") pod \"6835f243-4f76-4e56-a99a-77ba34fbde14\" (UID: \"6835f243-4f76-4e56-a99a-77ba34fbde14\") " Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.099491 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6835f243-4f76-4e56-a99a-77ba34fbde14-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6835f243-4f76-4e56-a99a-77ba34fbde14" (UID: "6835f243-4f76-4e56-a99a-77ba34fbde14"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.099607 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.099706 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6835f243-4f76-4e56-a99a-77ba34fbde14-logs" (OuterVolumeSpecName: "logs") pod "6835f243-4f76-4e56-a99a-77ba34fbde14" (UID: "6835f243-4f76-4e56-a99a-77ba34fbde14"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.099893 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="ceilometer-central-agent" containerID="cri-o://46c93440e4ac0eb96b1e7fe0093787ca8eb13cb9ee2b3c51ad9fd8ef58547ac4" gracePeriod=30 Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.100347 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="proxy-httpd" containerID="cri-o://2fe67af5044e399a1232bfd4f6fb6c9de728cd2e85844e9740d0ac3a6740c439" gracePeriod=30 Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.100429 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="sg-core" containerID="cri-o://c8277eb9332b889ead7b3e2b767467cbce54b54d72cf35b6ded93da975bc0ac6" gracePeriod=30 Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.100429 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="ceilometer-notification-agent" containerID="cri-o://eef99731cc48444f1922b330426628292b03d4929e0c4a3ee6c56eaa91d023df" gracePeriod=30 Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.105589 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-scripts" (OuterVolumeSpecName: "scripts") pod "6835f243-4f76-4e56-a99a-77ba34fbde14" (UID: "6835f243-4f76-4e56-a99a-77ba34fbde14"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.109431 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6835f243-4f76-4e56-a99a-77ba34fbde14" (UID: "6835f243-4f76-4e56-a99a-77ba34fbde14"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.111293 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6835f243-4f76-4e56-a99a-77ba34fbde14-kube-api-access-s7gls" (OuterVolumeSpecName: "kube-api-access-s7gls") pod "6835f243-4f76-4e56-a99a-77ba34fbde14" (UID: "6835f243-4f76-4e56-a99a-77ba34fbde14"). InnerVolumeSpecName "kube-api-access-s7gls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.137943 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6835f243-4f76-4e56-a99a-77ba34fbde14" (UID: "6835f243-4f76-4e56-a99a-77ba34fbde14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.159336 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data" (OuterVolumeSpecName: "config-data") pod "6835f243-4f76-4e56-a99a-77ba34fbde14" (UID: "6835f243-4f76-4e56-a99a-77ba34fbde14"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.202471 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.202517 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.202528 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.202539 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6835f243-4f76-4e56-a99a-77ba34fbde14-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.202551 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7gls\" (UniqueName: \"kubernetes.io/projected/6835f243-4f76-4e56-a99a-77ba34fbde14-kube-api-access-s7gls\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.202678 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6835f243-4f76-4e56-a99a-77ba34fbde14-logs\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.202691 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6835f243-4f76-4e56-a99a-77ba34fbde14-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.411293 4834 generic.go:334] "Generic 
(PLEG): container finished" podID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerID="29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b" exitCode=0
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.411336 4834 generic.go:334] "Generic (PLEG): container finished" podID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerID="0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347" exitCode=143
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.411382 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6835f243-4f76-4e56-a99a-77ba34fbde14","Type":"ContainerDied","Data":"29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b"}
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.411409 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6835f243-4f76-4e56-a99a-77ba34fbde14","Type":"ContainerDied","Data":"0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347"}
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.411419 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6835f243-4f76-4e56-a99a-77ba34fbde14","Type":"ContainerDied","Data":"ab51f148cb75d881824a432c8fa92604a4c9ee6a7d695a564987ee4cbaab0e4c"}
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.411434 4834 scope.go:117] "RemoveContainer" containerID="29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.411551 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.438201 4834 generic.go:334] "Generic (PLEG): container finished" podID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerID="2fe67af5044e399a1232bfd4f6fb6c9de728cd2e85844e9740d0ac3a6740c439" exitCode=0
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.438229 4834 generic.go:334] "Generic (PLEG): container finished" podID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerID="c8277eb9332b889ead7b3e2b767467cbce54b54d72cf35b6ded93da975bc0ac6" exitCode=2
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.438795 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerDied","Data":"2fe67af5044e399a1232bfd4f6fb6c9de728cd2e85844e9740d0ac3a6740c439"}
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.438826 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerDied","Data":"c8277eb9332b889ead7b3e2b767467cbce54b54d72cf35b6ded93da975bc0ac6"}
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.470205 4834 scope.go:117] "RemoveContainer" containerID="0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.470298 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.474456 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"]
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.497160 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Nov 26 12:51:59 crc kubenswrapper[4834]: E1126 12:51:59.497487 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerName="manila-api-log"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.497499 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerName="manila-api-log"
Nov 26 12:51:59 crc kubenswrapper[4834]: E1126 12:51:59.497525 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerName="manila-api"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.497532 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerName="manila-api"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.497684 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerName="manila-api-log"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.497711 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6835f243-4f76-4e56-a99a-77ba34fbde14" containerName="manila-api"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.498514 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.501446 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.506544 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.506831 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.508039 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.508093 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dcf49f7-938b-417a-91c0-52cbd58f8c62-logs\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.508121 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-config-data-custom\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.508140 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-scripts\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.508160 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9dcf49f7-938b-417a-91c0-52cbd58f8c62-etc-machine-id\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.508210 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-config-data\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.508258 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-public-tls-certs\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.508288 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-internal-tls-certs\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.508399 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvdz5\" (UniqueName: \"kubernetes.io/projected/9dcf49f7-938b-417a-91c0-52cbd58f8c62-kube-api-access-qvdz5\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.518918 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.580479 4834 scope.go:117] "RemoveContainer" containerID="29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b"
Nov 26 12:51:59 crc kubenswrapper[4834]: E1126 12:51:59.585100 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b\": container with ID starting with 29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b not found: ID does not exist" containerID="29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.585612 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b"} err="failed to get container status \"29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b\": rpc error: code = NotFound desc = could not find container \"29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b\": container with ID starting with 29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b not found: ID does not exist"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.585675 4834 scope.go:117] "RemoveContainer" containerID="0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347"
Nov 26 12:51:59 crc kubenswrapper[4834]: E1126 12:51:59.592594 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347\": container with ID starting with 0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347 not found: ID does not exist" containerID="0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.592648 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347"} err="failed to get container status \"0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347\": rpc error: code = NotFound desc = could not find container \"0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347\": container with ID starting with 0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347 not found: ID does not exist"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.592697 4834 scope.go:117] "RemoveContainer" containerID="29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.594561 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b"} err="failed to get container status \"29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b\": rpc error: code = NotFound desc = could not find container \"29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b\": container with ID starting with 29aab5e16879120da058269dafa15063d4c34161a5f5415f0bea3fe0f044e66b not found: ID does not exist"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.594590 4834 scope.go:117] "RemoveContainer" containerID="0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.595340 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347"} err="failed to get container status \"0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347\": rpc error: code = NotFound desc = could not find container \"0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347\": container with ID starting with 0dc15cd30e04cae3cdf0ed6f7d702d570f51bfd47aa7bf99a8e01ca048c9e347 not found: ID does not exist"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.612492 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvdz5\" (UniqueName: \"kubernetes.io/projected/9dcf49f7-938b-417a-91c0-52cbd58f8c62-kube-api-access-qvdz5\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.612846 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.612913 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-config-data-custom\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.612936 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dcf49f7-938b-417a-91c0-52cbd58f8c62-logs\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.612978 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-scripts\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.613020 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9dcf49f7-938b-417a-91c0-52cbd58f8c62-etc-machine-id\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.613081 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-config-data\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.613154 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-public-tls-certs\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.613206 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-internal-tls-certs\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.616361 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9dcf49f7-938b-417a-91c0-52cbd58f8c62-etc-machine-id\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.619511 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-scripts\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.619828 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dcf49f7-938b-417a-91c0-52cbd58f8c62-logs\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.622939 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-public-tls-certs\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.624436 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.627131 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-internal-tls-certs\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.627403 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-config-data\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.627813 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dcf49f7-938b-417a-91c0-52cbd58f8c62-config-data-custom\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.631873 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvdz5\" (UniqueName: \"kubernetes.io/projected/9dcf49f7-938b-417a-91c0-52cbd58f8c62-kube-api-access-qvdz5\") pod \"manila-api-0\" (UID: \"9dcf49f7-938b-417a-91c0-52cbd58f8c62\") " pod="openstack/manila-api-0"
Nov 26 12:51:59 crc kubenswrapper[4834]: I1126 12:51:59.875232 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Nov 26 12:52:00 crc kubenswrapper[4834]: I1126 12:52:00.428581 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6835f243-4f76-4e56-a99a-77ba34fbde14" path="/var/lib/kubelet/pods/6835f243-4f76-4e56-a99a-77ba34fbde14/volumes"
Nov 26 12:52:00 crc kubenswrapper[4834]: I1126 12:52:00.450949 4834 generic.go:334] "Generic (PLEG): container finished" podID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerID="46c93440e4ac0eb96b1e7fe0093787ca8eb13cb9ee2b3c51ad9fd8ef58547ac4" exitCode=0
Nov 26 12:52:00 crc kubenswrapper[4834]: I1126 12:52:00.451005 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerDied","Data":"46c93440e4ac0eb96b1e7fe0093787ca8eb13cb9ee2b3c51ad9fd8ef58547ac4"}
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.481566 4834 generic.go:334] "Generic (PLEG): container finished" podID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerID="eef99731cc48444f1922b330426628292b03d4929e0c4a3ee6c56eaa91d023df" exitCode=0
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.481646 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerDied","Data":"eef99731cc48444f1922b330426628292b03d4929e0c4a3ee6c56eaa91d023df"}
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.736979 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.812526 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-combined-ca-bundle\") pod \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") "
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.812702 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-run-httpd\") pod \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") "
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.812845 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-sg-core-conf-yaml\") pod \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") "
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.812949 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-ceilometer-tls-certs\") pod \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") "
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.813035 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-log-httpd\") pod \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") "
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.813165 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzcz8\" (UniqueName: \"kubernetes.io/projected/0f3d377f-7b47-4237-8ea4-c697d52f30c8-kube-api-access-nzcz8\") pod \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") "
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.813292 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-config-data\") pod \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") "
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.813396 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-scripts\") pod \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\" (UID: \"0f3d377f-7b47-4237-8ea4-c697d52f30c8\") "
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.813349 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0f3d377f-7b47-4237-8ea4-c697d52f30c8" (UID: "0f3d377f-7b47-4237-8ea4-c697d52f30c8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.813661 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0f3d377f-7b47-4237-8ea4-c697d52f30c8" (UID: "0f3d377f-7b47-4237-8ea4-c697d52f30c8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.822954 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-scripts" (OuterVolumeSpecName: "scripts") pod "0f3d377f-7b47-4237-8ea4-c697d52f30c8" (UID: "0f3d377f-7b47-4237-8ea4-c697d52f30c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.823172 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3d377f-7b47-4237-8ea4-c697d52f30c8-kube-api-access-nzcz8" (OuterVolumeSpecName: "kube-api-access-nzcz8") pod "0f3d377f-7b47-4237-8ea4-c697d52f30c8" (UID: "0f3d377f-7b47-4237-8ea4-c697d52f30c8"). InnerVolumeSpecName "kube-api-access-nzcz8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.846906 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0f3d377f-7b47-4237-8ea4-c697d52f30c8" (UID: "0f3d377f-7b47-4237-8ea4-c697d52f30c8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.876384 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0f3d377f-7b47-4237-8ea4-c697d52f30c8" (UID: "0f3d377f-7b47-4237-8ea4-c697d52f30c8"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.902563 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-config-data" (OuterVolumeSpecName: "config-data") pod "0f3d377f-7b47-4237-8ea4-c697d52f30c8" (UID: "0f3d377f-7b47-4237-8ea4-c697d52f30c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.903813 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f3d377f-7b47-4237-8ea4-c697d52f30c8" (UID: "0f3d377f-7b47-4237-8ea4-c697d52f30c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.915388 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.915414 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.915427 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.915436 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzcz8\" (UniqueName: \"kubernetes.io/projected/0f3d377f-7b47-4237-8ea4-c697d52f30c8-kube-api-access-nzcz8\") on node \"crc\" DevicePath \"\""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.915448 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.915456 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.915465 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3d377f-7b47-4237-8ea4-c697d52f30c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.915472 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f3d377f-7b47-4237-8ea4-c697d52f30c8-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 26 12:52:03 crc kubenswrapper[4834]: I1126 12:52:03.929737 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Nov 26 12:52:03 crc kubenswrapper[4834]: W1126 12:52:03.930691 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dcf49f7_938b_417a_91c0_52cbd58f8c62.slice/crio-ebd7fa06e4003bfcc6ba13b1c5285d85b51150245de69838cbbab05f9d0c15cd WatchSource:0}: Error finding container ebd7fa06e4003bfcc6ba13b1c5285d85b51150245de69838cbbab05f9d0c15cd: Status 404 returned error can't find the container with id ebd7fa06e4003bfcc6ba13b1c5285d85b51150245de69838cbbab05f9d0c15cd
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.504442 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9dcf49f7-938b-417a-91c0-52cbd58f8c62","Type":"ContainerStarted","Data":"51a47c187fa78352513b55652ebd106f4754b199905322040bcc841e0b2064de"}
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.504723 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9dcf49f7-938b-417a-91c0-52cbd58f8c62","Type":"ContainerStarted","Data":"ebd7fa06e4003bfcc6ba13b1c5285d85b51150245de69838cbbab05f9d0c15cd"}
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.508295 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.508325 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f3d377f-7b47-4237-8ea4-c697d52f30c8","Type":"ContainerDied","Data":"b4b5e9057ebc9d766279f0e7a6ae9aff8df054022d02f67a23abdd7819a08241"}
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.508439 4834 scope.go:117] "RemoveContainer" containerID="2fe67af5044e399a1232bfd4f6fb6c9de728cd2e85844e9740d0ac3a6740c439"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.512493 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9e538d8f-eaf6-4f97-a773-1d965586a83d","Type":"ContainerStarted","Data":"ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26"}
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.512519 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9e538d8f-eaf6-4f97-a773-1d965586a83d","Type":"ContainerStarted","Data":"e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24"}
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.530815 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.541607 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.560986 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 26 12:52:04 crc kubenswrapper[4834]: E1126 12:52:04.563851 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="sg-core"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.564142 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="sg-core"
Nov 26 12:52:04 crc kubenswrapper[4834]: E1126 12:52:04.564379 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="ceilometer-notification-agent"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.564679 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="ceilometer-notification-agent"
Nov 26 12:52:04 crc kubenswrapper[4834]: E1126 12:52:04.564775 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="ceilometer-central-agent"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.564921 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="ceilometer-central-agent"
Nov 26 12:52:04 crc kubenswrapper[4834]: E1126 12:52:04.565317 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="proxy-httpd"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.565783 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="proxy-httpd"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.567373 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="proxy-httpd"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.568869 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="ceilometer-notification-agent"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.568997 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="sg-core"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.569120 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" containerName="ceilometer-central-agent"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.568677 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.940206448 podStartE2EDuration="10.568663617s" podCreationTimestamp="2025-11-26 12:51:54 +0000 UTC" firstStartedPulling="2025-11-26 12:51:55.82041868 +0000 UTC m=+2413.727632033" lastFinishedPulling="2025-11-26 12:52:03.448875851 +0000 UTC m=+2421.356089202" observedRunningTime="2025-11-26 12:52:04.547445967 +0000 UTC m=+2422.454659318" watchObservedRunningTime="2025-11-26 12:52:04.568663617 +0000 UTC m=+2422.475876969"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.572740 4834 scope.go:117] "RemoveContainer" containerID="c8277eb9332b889ead7b3e2b767467cbce54b54d72cf35b6ded93da975bc0ac6"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.583532 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.585746 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.590022 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.590169 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.590590 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.610791 4834 scope.go:117] "RemoveContainer" containerID="eef99731cc48444f1922b330426628292b03d4929e0c4a3ee6c56eaa91d023df"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.627040 4834 scope.go:117] "RemoveContainer" containerID="46c93440e4ac0eb96b1e7fe0093787ca8eb13cb9ee2b3c51ad9fd8ef58547ac4"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.730388 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn8xl\" (UniqueName: \"kubernetes.io/projected/65398ed3-2d33-4693-8570-1d29e9c2ce52-kube-api-access-xn8xl\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.730441 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-run-httpd\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.730480 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-scripts\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.730502 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.730539 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.730559 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.730622 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-log-httpd\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.730650 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-config-data\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.834804 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-log-httpd\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.834978 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-config-data\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.835181 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn8xl\" (UniqueName: \"kubernetes.io/projected/65398ed3-2d33-4693-8570-1d29e9c2ce52-kube-api-access-xn8xl\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.835238 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-run-httpd\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.835323 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-scripts\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0"
Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.835381 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\"
(UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.835472 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.835509 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.835573 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-log-httpd\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.839138 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-run-httpd\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.842064 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: 
I1126 12:52:04.842354 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-config-data\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.843187 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-scripts\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.846362 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.846454 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.856109 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn8xl\" (UniqueName: \"kubernetes.io/projected/65398ed3-2d33-4693-8570-1d29e9c2ce52-kube-api-access-xn8xl\") pod \"ceilometer-0\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.914617 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.962042 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 26 12:52:04 crc kubenswrapper[4834]: I1126 12:52:04.988636 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.054487 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78f48d6b7c-vglzt" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.108639 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c58867b6c-24msd"] Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.108849 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c58867b6c-24msd" podUID="f93c6d5e-760a-4fcf-881f-8132bc217c3d" containerName="dnsmasq-dns" containerID="cri-o://bb910519d02a4fb561360c59bf461b5e6c9ac9404ecea01044b038c46368c939" gracePeriod=10 Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.377295 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.521764 4834 generic.go:334] "Generic (PLEG): container finished" podID="f93c6d5e-760a-4fcf-881f-8132bc217c3d" containerID="bb910519d02a4fb561360c59bf461b5e6c9ac9404ecea01044b038c46368c939" exitCode=0 Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.521824 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c58867b6c-24msd" event={"ID":"f93c6d5e-760a-4fcf-881f-8132bc217c3d","Type":"ContainerDied","Data":"bb910519d02a4fb561360c59bf461b5e6c9ac9404ecea01044b038c46368c939"} Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.521851 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c58867b6c-24msd" 
event={"ID":"f93c6d5e-760a-4fcf-881f-8132bc217c3d","Type":"ContainerDied","Data":"203a1eb32a0c959b7d7d851b78df2738592699ccb30e8a092cc3ff800430df3e"} Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.521869 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="203a1eb32a0c959b7d7d851b78df2738592699ccb30e8a092cc3ff800430df3e" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.523686 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9dcf49f7-938b-417a-91c0-52cbd58f8c62","Type":"ContainerStarted","Data":"3df7caea5cfa512ba18848949473158dc14f9cdfe280e6aa9106d653039bc0d7"} Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.524465 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.527798 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerStarted","Data":"1f70ccd99b49b1cd1a2d93c8ca9c0b9064508ba9898653af592486370f190fbf"} Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.533823 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.544400 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.544382121 podStartE2EDuration="6.544382121s" podCreationTimestamp="2025-11-26 12:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:52:05.541657653 +0000 UTC m=+2423.448871005" watchObservedRunningTime="2025-11-26 12:52:05.544382121 +0000 UTC m=+2423.451595474" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.649054 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-openstack-edpm-ipam\") pod \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.649178 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-sb\") pod \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.650293 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-nb\") pod \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.650461 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-dns-svc\") pod \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\" (UID: 
\"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.650519 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-config\") pod \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.650755 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zv7x\" (UniqueName: \"kubernetes.io/projected/f93c6d5e-760a-4fcf-881f-8132bc217c3d-kube-api-access-9zv7x\") pod \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\" (UID: \"f93c6d5e-760a-4fcf-881f-8132bc217c3d\") " Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.674304 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93c6d5e-760a-4fcf-881f-8132bc217c3d-kube-api-access-9zv7x" (OuterVolumeSpecName: "kube-api-access-9zv7x") pod "f93c6d5e-760a-4fcf-881f-8132bc217c3d" (UID: "f93c6d5e-760a-4fcf-881f-8132bc217c3d"). InnerVolumeSpecName "kube-api-access-9zv7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.702127 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f93c6d5e-760a-4fcf-881f-8132bc217c3d" (UID: "f93c6d5e-760a-4fcf-881f-8132bc217c3d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.707207 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f93c6d5e-760a-4fcf-881f-8132bc217c3d" (UID: "f93c6d5e-760a-4fcf-881f-8132bc217c3d"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.718672 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f93c6d5e-760a-4fcf-881f-8132bc217c3d" (UID: "f93c6d5e-760a-4fcf-881f-8132bc217c3d"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.726452 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f93c6d5e-760a-4fcf-881f-8132bc217c3d" (UID: "f93c6d5e-760a-4fcf-881f-8132bc217c3d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.740057 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-config" (OuterVolumeSpecName: "config") pod "f93c6d5e-760a-4fcf-881f-8132bc217c3d" (UID: "f93c6d5e-760a-4fcf-881f-8132bc217c3d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.754175 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.754224 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.754236 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.754263 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.754275 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93c6d5e-760a-4fcf-881f-8132bc217c3d-config\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:05 crc kubenswrapper[4834]: I1126 12:52:05.754285 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zv7x\" (UniqueName: \"kubernetes.io/projected/f93c6d5e-760a-4fcf-881f-8132bc217c3d-kube-api-access-9zv7x\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:06 crc kubenswrapper[4834]: I1126 12:52:06.425835 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3d377f-7b47-4237-8ea4-c697d52f30c8" path="/var/lib/kubelet/pods/0f3d377f-7b47-4237-8ea4-c697d52f30c8/volumes" Nov 26 12:52:06 crc kubenswrapper[4834]: I1126 12:52:06.535784 4834 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c58867b6c-24msd" Nov 26 12:52:06 crc kubenswrapper[4834]: I1126 12:52:06.535833 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerStarted","Data":"0bf0ce18f609352cdcb05a7277e4cdd915c3b5f612786d8e3af1dd560a18b1bb"} Nov 26 12:52:06 crc kubenswrapper[4834]: I1126 12:52:06.558520 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c58867b6c-24msd"] Nov 26 12:52:06 crc kubenswrapper[4834]: I1126 12:52:06.564492 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c58867b6c-24msd"] Nov 26 12:52:07 crc kubenswrapper[4834]: I1126 12:52:07.549824 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerStarted","Data":"e11c2b84a2fb06454689e0703d3202f9fb5259d0b2baba30522ab4b30f358df7"} Nov 26 12:52:07 crc kubenswrapper[4834]: I1126 12:52:07.678477 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:52:08 crc kubenswrapper[4834]: I1126 12:52:08.417779 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:52:08 crc kubenswrapper[4834]: E1126 12:52:08.418453 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:52:08 crc kubenswrapper[4834]: I1126 12:52:08.427756 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f93c6d5e-760a-4fcf-881f-8132bc217c3d" path="/var/lib/kubelet/pods/f93c6d5e-760a-4fcf-881f-8132bc217c3d/volumes" Nov 26 12:52:08 crc kubenswrapper[4834]: I1126 12:52:08.563684 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerStarted","Data":"fa5b3d58f91c78097d9a744e1affc4947f7c9d6c628cb8896c00576093de5dc2"} Nov 26 12:52:10 crc kubenswrapper[4834]: I1126 12:52:10.581293 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerStarted","Data":"31be80683d200e28706e02ba24aa65f0dc857294989098ec5bf682a501fe0d66"} Nov 26 12:52:10 crc kubenswrapper[4834]: I1126 12:52:10.581479 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="ceilometer-central-agent" containerID="cri-o://0bf0ce18f609352cdcb05a7277e4cdd915c3b5f612786d8e3af1dd560a18b1bb" gracePeriod=30 Nov 26 12:52:10 crc kubenswrapper[4834]: I1126 12:52:10.581741 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 12:52:10 crc kubenswrapper[4834]: I1126 12:52:10.581523 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="sg-core" containerID="cri-o://fa5b3d58f91c78097d9a744e1affc4947f7c9d6c628cb8896c00576093de5dc2" gracePeriod=30 Nov 26 12:52:10 crc kubenswrapper[4834]: I1126 12:52:10.581573 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="ceilometer-notification-agent" containerID="cri-o://e11c2b84a2fb06454689e0703d3202f9fb5259d0b2baba30522ab4b30f358df7" gracePeriod=30 Nov 26 12:52:10 crc kubenswrapper[4834]: I1126 12:52:10.581523 
4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="proxy-httpd" containerID="cri-o://31be80683d200e28706e02ba24aa65f0dc857294989098ec5bf682a501fe0d66" gracePeriod=30 Nov 26 12:52:10 crc kubenswrapper[4834]: I1126 12:52:10.604640 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.154369256 podStartE2EDuration="6.604625843s" podCreationTimestamp="2025-11-26 12:52:04 +0000 UTC" firstStartedPulling="2025-11-26 12:52:05.377279253 +0000 UTC m=+2423.284492605" lastFinishedPulling="2025-11-26 12:52:09.827535841 +0000 UTC m=+2427.734749192" observedRunningTime="2025-11-26 12:52:10.59968258 +0000 UTC m=+2428.506895921" watchObservedRunningTime="2025-11-26 12:52:10.604625843 +0000 UTC m=+2428.511839196" Nov 26 12:52:11 crc kubenswrapper[4834]: I1126 12:52:11.593424 4834 generic.go:334] "Generic (PLEG): container finished" podID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerID="31be80683d200e28706e02ba24aa65f0dc857294989098ec5bf682a501fe0d66" exitCode=0 Nov 26 12:52:11 crc kubenswrapper[4834]: I1126 12:52:11.593458 4834 generic.go:334] "Generic (PLEG): container finished" podID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerID="fa5b3d58f91c78097d9a744e1affc4947f7c9d6c628cb8896c00576093de5dc2" exitCode=2 Nov 26 12:52:11 crc kubenswrapper[4834]: I1126 12:52:11.593466 4834 generic.go:334] "Generic (PLEG): container finished" podID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerID="e11c2b84a2fb06454689e0703d3202f9fb5259d0b2baba30522ab4b30f358df7" exitCode=0 Nov 26 12:52:11 crc kubenswrapper[4834]: I1126 12:52:11.593488 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerDied","Data":"31be80683d200e28706e02ba24aa65f0dc857294989098ec5bf682a501fe0d66"} Nov 26 12:52:11 crc kubenswrapper[4834]: I1126 
12:52:11.593515 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerDied","Data":"fa5b3d58f91c78097d9a744e1affc4947f7c9d6c628cb8896c00576093de5dc2"} Nov 26 12:52:11 crc kubenswrapper[4834]: I1126 12:52:11.593524 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerDied","Data":"e11c2b84a2fb06454689e0703d3202f9fb5259d0b2baba30522ab4b30f358df7"} Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.604406 4834 generic.go:334] "Generic (PLEG): container finished" podID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerID="0bf0ce18f609352cdcb05a7277e4cdd915c3b5f612786d8e3af1dd560a18b1bb" exitCode=0 Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.604540 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerDied","Data":"0bf0ce18f609352cdcb05a7277e4cdd915c3b5f612786d8e3af1dd560a18b1bb"} Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.910884 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.915083 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-scripts\") pod \"65398ed3-2d33-4693-8570-1d29e9c2ce52\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.915234 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-ceilometer-tls-certs\") pod \"65398ed3-2d33-4693-8570-1d29e9c2ce52\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.915285 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-combined-ca-bundle\") pod \"65398ed3-2d33-4693-8570-1d29e9c2ce52\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.915477 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-sg-core-conf-yaml\") pod \"65398ed3-2d33-4693-8570-1d29e9c2ce52\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.915537 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-run-httpd\") pod \"65398ed3-2d33-4693-8570-1d29e9c2ce52\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.915566 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-config-data\") pod \"65398ed3-2d33-4693-8570-1d29e9c2ce52\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.915636 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-log-httpd\") pod \"65398ed3-2d33-4693-8570-1d29e9c2ce52\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.915691 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn8xl\" (UniqueName: \"kubernetes.io/projected/65398ed3-2d33-4693-8570-1d29e9c2ce52-kube-api-access-xn8xl\") pod \"65398ed3-2d33-4693-8570-1d29e9c2ce52\" (UID: \"65398ed3-2d33-4693-8570-1d29e9c2ce52\") " Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.915919 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "65398ed3-2d33-4693-8570-1d29e9c2ce52" (UID: "65398ed3-2d33-4693-8570-1d29e9c2ce52"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.916115 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "65398ed3-2d33-4693-8570-1d29e9c2ce52" (UID: "65398ed3-2d33-4693-8570-1d29e9c2ce52"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.916635 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.916676 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65398ed3-2d33-4693-8570-1d29e9c2ce52-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.921183 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-scripts" (OuterVolumeSpecName: "scripts") pod "65398ed3-2d33-4693-8570-1d29e9c2ce52" (UID: "65398ed3-2d33-4693-8570-1d29e9c2ce52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.922492 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65398ed3-2d33-4693-8570-1d29e9c2ce52-kube-api-access-xn8xl" (OuterVolumeSpecName: "kube-api-access-xn8xl") pod "65398ed3-2d33-4693-8570-1d29e9c2ce52" (UID: "65398ed3-2d33-4693-8570-1d29e9c2ce52"). InnerVolumeSpecName "kube-api-access-xn8xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.946172 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "65398ed3-2d33-4693-8570-1d29e9c2ce52" (UID: "65398ed3-2d33-4693-8570-1d29e9c2ce52"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.969391 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "65398ed3-2d33-4693-8570-1d29e9c2ce52" (UID: "65398ed3-2d33-4693-8570-1d29e9c2ce52"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:12 crc kubenswrapper[4834]: I1126 12:52:12.993648 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65398ed3-2d33-4693-8570-1d29e9c2ce52" (UID: "65398ed3-2d33-4693-8570-1d29e9c2ce52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.003177 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-config-data" (OuterVolumeSpecName: "config-data") pod "65398ed3-2d33-4693-8570-1d29e9c2ce52" (UID: "65398ed3-2d33-4693-8570-1d29e9c2ce52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.019917 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.019956 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.019971 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn8xl\" (UniqueName: \"kubernetes.io/projected/65398ed3-2d33-4693-8570-1d29e9c2ce52-kube-api-access-xn8xl\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.019987 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.019999 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.020011 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65398ed3-2d33-4693-8570-1d29e9c2ce52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.618816 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65398ed3-2d33-4693-8570-1d29e9c2ce52","Type":"ContainerDied","Data":"1f70ccd99b49b1cd1a2d93c8ca9c0b9064508ba9898653af592486370f190fbf"} Nov 26 12:52:13 crc 
kubenswrapper[4834]: I1126 12:52:13.618941 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.619154 4834 scope.go:117] "RemoveContainer" containerID="31be80683d200e28706e02ba24aa65f0dc857294989098ec5bf682a501fe0d66" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.637291 4834 scope.go:117] "RemoveContainer" containerID="fa5b3d58f91c78097d9a744e1affc4947f7c9d6c628cb8896c00576093de5dc2" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.675011 4834 scope.go:117] "RemoveContainer" containerID="e11c2b84a2fb06454689e0703d3202f9fb5259d0b2baba30522ab4b30f358df7" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.691546 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.693354 4834 scope.go:117] "RemoveContainer" containerID="0bf0ce18f609352cdcb05a7277e4cdd915c3b5f612786d8e3af1dd560a18b1bb" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.705284 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.712702 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:52:13 crc kubenswrapper[4834]: E1126 12:52:13.713079 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="sg-core" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713096 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="sg-core" Nov 26 12:52:13 crc kubenswrapper[4834]: E1126 12:52:13.713110 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93c6d5e-760a-4fcf-881f-8132bc217c3d" containerName="dnsmasq-dns" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713118 4834 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f93c6d5e-760a-4fcf-881f-8132bc217c3d" containerName="dnsmasq-dns" Nov 26 12:52:13 crc kubenswrapper[4834]: E1126 12:52:13.713130 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="ceilometer-central-agent" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713136 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="ceilometer-central-agent" Nov 26 12:52:13 crc kubenswrapper[4834]: E1126 12:52:13.713154 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="proxy-httpd" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713159 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="proxy-httpd" Nov 26 12:52:13 crc kubenswrapper[4834]: E1126 12:52:13.713171 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93c6d5e-760a-4fcf-881f-8132bc217c3d" containerName="init" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713176 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93c6d5e-760a-4fcf-881f-8132bc217c3d" containerName="init" Nov 26 12:52:13 crc kubenswrapper[4834]: E1126 12:52:13.713199 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="ceilometer-notification-agent" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713206 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="ceilometer-notification-agent" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713395 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="ceilometer-notification-agent" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713408 4834 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="ceilometer-central-agent" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713424 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93c6d5e-760a-4fcf-881f-8132bc217c3d" containerName="dnsmasq-dns" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713435 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="sg-core" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.713445 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" containerName="proxy-httpd" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.714954 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.716648 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.716788 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.717146 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.725302 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.735916 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-config-data\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.735967 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca87460-b55b-4744-af3d-37a0a9662784-log-httpd\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.736117 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.736262 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-scripts\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.736292 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.736612 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.736659 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7ca87460-b55b-4744-af3d-37a0a9662784-run-httpd\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.736733 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65bln\" (UniqueName: \"kubernetes.io/projected/7ca87460-b55b-4744-af3d-37a0a9662784-kube-api-access-65bln\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.837302 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca87460-b55b-4744-af3d-37a0a9662784-run-httpd\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.837399 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65bln\" (UniqueName: \"kubernetes.io/projected/7ca87460-b55b-4744-af3d-37a0a9662784-kube-api-access-65bln\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.837447 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-config-data\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.837471 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca87460-b55b-4744-af3d-37a0a9662784-log-httpd\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc 
kubenswrapper[4834]: I1126 12:52:13.837490 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.837513 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.837529 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-scripts\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.837611 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.838669 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca87460-b55b-4744-af3d-37a0a9662784-log-httpd\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.839244 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca87460-b55b-4744-af3d-37a0a9662784-run-httpd\") pod 
\"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.842288 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.842932 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.844978 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-config-data\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.848882 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.852416 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca87460-b55b-4744-af3d-37a0a9662784-scripts\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:13 crc kubenswrapper[4834]: I1126 12:52:13.854665 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65bln\" 
(UniqueName: \"kubernetes.io/projected/7ca87460-b55b-4744-af3d-37a0a9662784-kube-api-access-65bln\") pod \"ceilometer-0\" (UID: \"7ca87460-b55b-4744-af3d-37a0a9662784\") " pod="openstack/ceilometer-0" Nov 26 12:52:14 crc kubenswrapper[4834]: I1126 12:52:14.028806 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 12:52:14 crc kubenswrapper[4834]: I1126 12:52:14.426720 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65398ed3-2d33-4693-8570-1d29e9c2ce52" path="/var/lib/kubelet/pods/65398ed3-2d33-4693-8570-1d29e9c2ce52/volumes" Nov 26 12:52:14 crc kubenswrapper[4834]: W1126 12:52:14.453523 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ca87460_b55b_4744_af3d_37a0a9662784.slice/crio-28a1f5198cd1fc0f04fb7c3b593e3b6489e7350177225c99d74358bc61a8692a WatchSource:0}: Error finding container 28a1f5198cd1fc0f04fb7c3b593e3b6489e7350177225c99d74358bc61a8692a: Status 404 returned error can't find the container with id 28a1f5198cd1fc0f04fb7c3b593e3b6489e7350177225c99d74358bc61a8692a Nov 26 12:52:14 crc kubenswrapper[4834]: I1126 12:52:14.455217 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 12:52:14 crc kubenswrapper[4834]: I1126 12:52:14.628144 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca87460-b55b-4744-af3d-37a0a9662784","Type":"ContainerStarted","Data":"28a1f5198cd1fc0f04fb7c3b593e3b6489e7350177225c99d74358bc61a8692a"} Nov 26 12:52:15 crc kubenswrapper[4834]: I1126 12:52:15.638148 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca87460-b55b-4744-af3d-37a0a9662784","Type":"ContainerStarted","Data":"2e64a7ab0c53a6fee41776d10d343c87299c296ebec85421baa4ac80695ba0e0"} Nov 26 12:52:16 crc kubenswrapper[4834]: I1126 12:52:16.149415 4834 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Nov 26 12:52:16 crc kubenswrapper[4834]: I1126 12:52:16.173052 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Nov 26 12:52:16 crc kubenswrapper[4834]: I1126 12:52:16.216830 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 12:52:16 crc kubenswrapper[4834]: I1126 12:52:16.255690 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 12:52:16 crc kubenswrapper[4834]: I1126 12:52:16.648979 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca87460-b55b-4744-af3d-37a0a9662784","Type":"ContainerStarted","Data":"d5b0d4432a8d82eb9b3e2e3a9384059187d6f9fb311f78002856fd943b816dd9"} Nov 26 12:52:16 crc kubenswrapper[4834]: I1126 12:52:16.649518 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="9e538d8f-eaf6-4f97-a773-1d965586a83d" containerName="probe" containerID="cri-o://ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26" gracePeriod=30 Nov 26 12:52:16 crc kubenswrapper[4834]: I1126 12:52:16.649667 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="9e538d8f-eaf6-4f97-a773-1d965586a83d" containerName="manila-share" containerID="cri-o://e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24" gracePeriod=30 Nov 26 12:52:16 crc kubenswrapper[4834]: I1126 12:52:16.649817 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="7ec328cd-60df-4d12-96be-5ed62485724b" containerName="manila-scheduler" containerID="cri-o://37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20" gracePeriod=30 Nov 26 12:52:16 crc kubenswrapper[4834]: I1126 12:52:16.649876 4834 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="7ec328cd-60df-4d12-96be-5ed62485724b" containerName="probe" containerID="cri-o://e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef" gracePeriod=30 Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.469949 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.638238 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-ceph\") pod \"9e538d8f-eaf6-4f97-a773-1d965586a83d\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.638841 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-etc-machine-id\") pod \"9e538d8f-eaf6-4f97-a773-1d965586a83d\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.638944 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9e538d8f-eaf6-4f97-a773-1d965586a83d" (UID: "9e538d8f-eaf6-4f97-a773-1d965586a83d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.638994 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-scripts\") pod \"9e538d8f-eaf6-4f97-a773-1d965586a83d\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.639196 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-combined-ca-bundle\") pod \"9e538d8f-eaf6-4f97-a773-1d965586a83d\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.639322 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data-custom\") pod \"9e538d8f-eaf6-4f97-a773-1d965586a83d\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.639380 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data\") pod \"9e538d8f-eaf6-4f97-a773-1d965586a83d\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.639467 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-var-lib-manila\") pod \"9e538d8f-eaf6-4f97-a773-1d965586a83d\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.639489 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqzpx\" (UniqueName: 
\"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-kube-api-access-zqzpx\") pod \"9e538d8f-eaf6-4f97-a773-1d965586a83d\" (UID: \"9e538d8f-eaf6-4f97-a773-1d965586a83d\") " Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.641161 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.645394 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "9e538d8f-eaf6-4f97-a773-1d965586a83d" (UID: "9e538d8f-eaf6-4f97-a773-1d965586a83d"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.645581 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-kube-api-access-zqzpx" (OuterVolumeSpecName: "kube-api-access-zqzpx") pod "9e538d8f-eaf6-4f97-a773-1d965586a83d" (UID: "9e538d8f-eaf6-4f97-a773-1d965586a83d"). InnerVolumeSpecName "kube-api-access-zqzpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.646150 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-scripts" (OuterVolumeSpecName: "scripts") pod "9e538d8f-eaf6-4f97-a773-1d965586a83d" (UID: "9e538d8f-eaf6-4f97-a773-1d965586a83d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.646386 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9e538d8f-eaf6-4f97-a773-1d965586a83d" (UID: "9e538d8f-eaf6-4f97-a773-1d965586a83d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.650118 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-ceph" (OuterVolumeSpecName: "ceph") pod "9e538d8f-eaf6-4f97-a773-1d965586a83d" (UID: "9e538d8f-eaf6-4f97-a773-1d965586a83d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.666864 4834 generic.go:334] "Generic (PLEG): container finished" podID="7ec328cd-60df-4d12-96be-5ed62485724b" containerID="e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef" exitCode=0 Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.666939 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7ec328cd-60df-4d12-96be-5ed62485724b","Type":"ContainerDied","Data":"e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef"} Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.676425 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca87460-b55b-4744-af3d-37a0a9662784","Type":"ContainerStarted","Data":"ded322f35f70c11987d5fa589156692b95a3c1581ecdc4a67607cccc74a241cd"} Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.679337 4834 generic.go:334] "Generic (PLEG): container finished" podID="9e538d8f-eaf6-4f97-a773-1d965586a83d" 
containerID="ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26" exitCode=0 Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.679359 4834 generic.go:334] "Generic (PLEG): container finished" podID="9e538d8f-eaf6-4f97-a773-1d965586a83d" containerID="e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24" exitCode=1 Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.679372 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9e538d8f-eaf6-4f97-a773-1d965586a83d","Type":"ContainerDied","Data":"ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26"} Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.679399 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9e538d8f-eaf6-4f97-a773-1d965586a83d","Type":"ContainerDied","Data":"e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24"} Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.679410 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9e538d8f-eaf6-4f97-a773-1d965586a83d","Type":"ContainerDied","Data":"67f78632d7fb7330f59f449a8c4b7debbd39e5940ad4cc9b02b03ac8320d5ec7"} Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.679425 4834 scope.go:117] "RemoveContainer" containerID="ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.679429 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.693206 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e538d8f-eaf6-4f97-a773-1d965586a83d" (UID: "9e538d8f-eaf6-4f97-a773-1d965586a83d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.706560 4834 scope.go:117] "RemoveContainer" containerID="e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.735258 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data" (OuterVolumeSpecName: "config-data") pod "9e538d8f-eaf6-4f97-a773-1d965586a83d" (UID: "9e538d8f-eaf6-4f97-a773-1d965586a83d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.735282 4834 scope.go:117] "RemoveContainer" containerID="ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26" Nov 26 12:52:17 crc kubenswrapper[4834]: E1126 12:52:17.735753 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26\": container with ID starting with ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26 not found: ID does not exist" containerID="ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.735793 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26"} err="failed to get container status \"ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26\": rpc error: code = NotFound desc = could not find container \"ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26\": container with ID starting with ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26 not found: ID does not exist" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 
12:52:17.735821 4834 scope.go:117] "RemoveContainer" containerID="e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24" Nov 26 12:52:17 crc kubenswrapper[4834]: E1126 12:52:17.736108 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24\": container with ID starting with e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24 not found: ID does not exist" containerID="e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.736148 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24"} err="failed to get container status \"e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24\": rpc error: code = NotFound desc = could not find container \"e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24\": container with ID starting with e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24 not found: ID does not exist" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.736173 4834 scope.go:117] "RemoveContainer" containerID="ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.736616 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26"} err="failed to get container status \"ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26\": rpc error: code = NotFound desc = could not find container \"ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26\": container with ID starting with ee658f47b191b9edaa319c3d24e34561b4ded174f171c08701bbe4e821de3b26 not found: ID does not exist" Nov 26 12:52:17 crc 
kubenswrapper[4834]: I1126 12:52:17.736649 4834 scope.go:117] "RemoveContainer" containerID="e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.736983 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24"} err="failed to get container status \"e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24\": rpc error: code = NotFound desc = could not find container \"e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24\": container with ID starting with e08e8f8c6ecb595bf0f6ad759dfe8e5a2a364b7ca74857b6efdff64cd6aa0a24 not found: ID does not exist" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.743737 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.743766 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.743778 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.743788 4834 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9e538d8f-eaf6-4f97-a773-1d965586a83d-var-lib-manila\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.743797 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqzpx\" (UniqueName: 
\"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-kube-api-access-zqzpx\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.743809 4834 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9e538d8f-eaf6-4f97-a773-1d965586a83d-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:17 crc kubenswrapper[4834]: I1126 12:52:17.743818 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e538d8f-eaf6-4f97-a773-1d965586a83d-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.006921 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.013279 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.026167 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 12:52:18 crc kubenswrapper[4834]: E1126 12:52:18.027197 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e538d8f-eaf6-4f97-a773-1d965586a83d" containerName="probe" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.027222 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e538d8f-eaf6-4f97-a773-1d965586a83d" containerName="probe" Nov 26 12:52:18 crc kubenswrapper[4834]: E1126 12:52:18.027240 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e538d8f-eaf6-4f97-a773-1d965586a83d" containerName="manila-share" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.027247 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e538d8f-eaf6-4f97-a773-1d965586a83d" containerName="manila-share" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.027464 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e538d8f-eaf6-4f97-a773-1d965586a83d" containerName="probe" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.027476 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e538d8f-eaf6-4f97-a773-1d965586a83d" containerName="manila-share" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.028509 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.032112 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.035429 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.053176 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.053295 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.053357 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8jtv\" (UniqueName: \"kubernetes.io/projected/629fb482-78c1-4e80-96c8-bfd0ec24993d-kube-api-access-g8jtv\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 
crc kubenswrapper[4834]: I1126 12:52:18.053421 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-config-data\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.053438 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/629fb482-78c1-4e80-96c8-bfd0ec24993d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.053562 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/629fb482-78c1-4e80-96c8-bfd0ec24993d-ceph\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.053686 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-scripts\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.053854 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/629fb482-78c1-4e80-96c8-bfd0ec24993d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155154 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/629fb482-78c1-4e80-96c8-bfd0ec24993d-ceph\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155508 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-scripts\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155581 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/629fb482-78c1-4e80-96c8-bfd0ec24993d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155663 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155690 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/629fb482-78c1-4e80-96c8-bfd0ec24993d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155760 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155786 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8jtv\" (UniqueName: \"kubernetes.io/projected/629fb482-78c1-4e80-96c8-bfd0ec24993d-kube-api-access-g8jtv\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155838 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-config-data\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155854 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/629fb482-78c1-4e80-96c8-bfd0ec24993d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.155953 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/629fb482-78c1-4e80-96c8-bfd0ec24993d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.159083 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/629fb482-78c1-4e80-96c8-bfd0ec24993d-ceph\") pod \"manila-share-share1-0\" (UID: 
\"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.159215 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.159744 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-config-data\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.160372 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.160777 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/629fb482-78c1-4e80-96c8-bfd0ec24993d-scripts\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.171657 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8jtv\" (UniqueName: \"kubernetes.io/projected/629fb482-78c1-4e80-96c8-bfd0ec24993d-kube-api-access-g8jtv\") pod \"manila-share-share1-0\" (UID: \"629fb482-78c1-4e80-96c8-bfd0ec24993d\") " pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.348614 4834 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.428124 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e538d8f-eaf6-4f97-a773-1d965586a83d" path="/var/lib/kubelet/pods/9e538d8f-eaf6-4f97-a773-1d965586a83d/volumes" Nov 26 12:52:18 crc kubenswrapper[4834]: I1126 12:52:18.854886 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.544650 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.593935 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data\") pod \"7ec328cd-60df-4d12-96be-5ed62485724b\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.594103 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-combined-ca-bundle\") pod \"7ec328cd-60df-4d12-96be-5ed62485724b\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.594131 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec328cd-60df-4d12-96be-5ed62485724b-etc-machine-id\") pod \"7ec328cd-60df-4d12-96be-5ed62485724b\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.594235 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data-custom\") pod \"7ec328cd-60df-4d12-96be-5ed62485724b\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.594261 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-scripts\") pod \"7ec328cd-60df-4d12-96be-5ed62485724b\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.594284 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfbbb\" (UniqueName: \"kubernetes.io/projected/7ec328cd-60df-4d12-96be-5ed62485724b-kube-api-access-dfbbb\") pod \"7ec328cd-60df-4d12-96be-5ed62485724b\" (UID: \"7ec328cd-60df-4d12-96be-5ed62485724b\") " Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.594403 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ec328cd-60df-4d12-96be-5ed62485724b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ec328cd-60df-4d12-96be-5ed62485724b" (UID: "7ec328cd-60df-4d12-96be-5ed62485724b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.594699 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec328cd-60df-4d12-96be-5ed62485724b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.598967 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ec328cd-60df-4d12-96be-5ed62485724b" (UID: "7ec328cd-60df-4d12-96be-5ed62485724b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.603988 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-scripts" (OuterVolumeSpecName: "scripts") pod "7ec328cd-60df-4d12-96be-5ed62485724b" (UID: "7ec328cd-60df-4d12-96be-5ed62485724b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.605050 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec328cd-60df-4d12-96be-5ed62485724b-kube-api-access-dfbbb" (OuterVolumeSpecName: "kube-api-access-dfbbb") pod "7ec328cd-60df-4d12-96be-5ed62485724b" (UID: "7ec328cd-60df-4d12-96be-5ed62485724b"). InnerVolumeSpecName "kube-api-access-dfbbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.659743 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ec328cd-60df-4d12-96be-5ed62485724b" (UID: "7ec328cd-60df-4d12-96be-5ed62485724b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.696982 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.697021 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.697091 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfbbb\" (UniqueName: \"kubernetes.io/projected/7ec328cd-60df-4d12-96be-5ed62485724b-kube-api-access-dfbbb\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.697857 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.707801 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data" (OuterVolumeSpecName: "config-data") pod "7ec328cd-60df-4d12-96be-5ed62485724b" (UID: "7ec328cd-60df-4d12-96be-5ed62485724b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.707913 4834 generic.go:334] "Generic (PLEG): container finished" podID="7ec328cd-60df-4d12-96be-5ed62485724b" containerID="37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20" exitCode=0 Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.708027 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.708052 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7ec328cd-60df-4d12-96be-5ed62485724b","Type":"ContainerDied","Data":"37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20"} Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.708091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7ec328cd-60df-4d12-96be-5ed62485724b","Type":"ContainerDied","Data":"97d83aac19ffb24e76465a8d144e24200544a635193f1aa1d93f7ec040f4712f"} Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.708110 4834 scope.go:117] "RemoveContainer" containerID="e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.715511 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca87460-b55b-4744-af3d-37a0a9662784","Type":"ContainerStarted","Data":"d67dc0e6e40548ab2535ee5257fad441c23c86676b7fb28ac0b0841a13cd1916"} Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.715814 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.718651 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"629fb482-78c1-4e80-96c8-bfd0ec24993d","Type":"ContainerStarted","Data":"86365dde82d56d5510af46e179dfd669a9dcfd7c7963f463f1e0e56cc63c3c76"} Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.718706 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"629fb482-78c1-4e80-96c8-bfd0ec24993d","Type":"ContainerStarted","Data":"1cde59d0c9a1f41166dd2f3802bf86d6337479cff55ae825e15aa0f0d3f351da"} Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.740121 4834 
scope.go:117] "RemoveContainer" containerID="37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.756441 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.079621284 podStartE2EDuration="6.756426345s" podCreationTimestamp="2025-11-26 12:52:13 +0000 UTC" firstStartedPulling="2025-11-26 12:52:14.457362752 +0000 UTC m=+2432.364576104" lastFinishedPulling="2025-11-26 12:52:19.134167813 +0000 UTC m=+2437.041381165" observedRunningTime="2025-11-26 12:52:19.735826829 +0000 UTC m=+2437.643040182" watchObservedRunningTime="2025-11-26 12:52:19.756426345 +0000 UTC m=+2437.663639697" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.768577 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.780428 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.786708 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 12:52:19 crc kubenswrapper[4834]: E1126 12:52:19.787361 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec328cd-60df-4d12-96be-5ed62485724b" containerName="manila-scheduler" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.787398 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec328cd-60df-4d12-96be-5ed62485724b" containerName="manila-scheduler" Nov 26 12:52:19 crc kubenswrapper[4834]: E1126 12:52:19.787422 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec328cd-60df-4d12-96be-5ed62485724b" containerName="probe" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.787428 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec328cd-60df-4d12-96be-5ed62485724b" containerName="probe" Nov 26 12:52:19 crc kubenswrapper[4834]: 
I1126 12:52:19.787913 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec328cd-60df-4d12-96be-5ed62485724b" containerName="manila-scheduler" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.787941 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec328cd-60df-4d12-96be-5ed62485724b" containerName="probe" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.789196 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.790779 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.799514 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckpgc\" (UniqueName: \"kubernetes.io/projected/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-kube-api-access-ckpgc\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.799553 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-scripts\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.799715 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.799800 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-config-data\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.799927 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.800013 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.800186 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec328cd-60df-4d12-96be-5ed62485724b-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.801475 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.804591 4834 scope.go:117] "RemoveContainer" containerID="e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef" Nov 26 12:52:19 crc kubenswrapper[4834]: E1126 12:52:19.805541 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef\": container with ID starting with 
e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef not found: ID does not exist" containerID="e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.805577 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef"} err="failed to get container status \"e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef\": rpc error: code = NotFound desc = could not find container \"e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef\": container with ID starting with e68bcea681e0a85e253083bcdbb5455b34d5089fdf2523bae4d5e60a8eb66eef not found: ID does not exist" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.805605 4834 scope.go:117] "RemoveContainer" containerID="37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20" Nov 26 12:52:19 crc kubenswrapper[4834]: E1126 12:52:19.807039 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20\": container with ID starting with 37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20 not found: ID does not exist" containerID="37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.807078 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20"} err="failed to get container status \"37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20\": rpc error: code = NotFound desc = could not find container \"37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20\": container with ID starting with 37f6190e6c71e8837155277d26ab5606d5c8b8031cedfc8a3de05011d2742a20 not found: ID does not 
exist" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.903031 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckpgc\" (UniqueName: \"kubernetes.io/projected/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-kube-api-access-ckpgc\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.903102 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-scripts\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.903278 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.904086 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-config-data\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.904104 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.904366 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.904490 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.908477 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.909183 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-config-data\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.909972 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.911591 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-scripts\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " 
pod="openstack/manila-scheduler-0" Nov 26 12:52:19 crc kubenswrapper[4834]: I1126 12:52:19.918054 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckpgc\" (UniqueName: \"kubernetes.io/projected/8e0a711b-a1c2-474b-87fb-729aaf8d00d7-kube-api-access-ckpgc\") pod \"manila-scheduler-0\" (UID: \"8e0a711b-a1c2-474b-87fb-729aaf8d00d7\") " pod="openstack/manila-scheduler-0" Nov 26 12:52:20 crc kubenswrapper[4834]: I1126 12:52:20.134726 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 26 12:52:20 crc kubenswrapper[4834]: I1126 12:52:20.437520 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec328cd-60df-4d12-96be-5ed62485724b" path="/var/lib/kubelet/pods/7ec328cd-60df-4d12-96be-5ed62485724b/volumes" Nov 26 12:52:20 crc kubenswrapper[4834]: I1126 12:52:20.606034 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 12:52:20 crc kubenswrapper[4834]: W1126 12:52:20.613424 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e0a711b_a1c2_474b_87fb_729aaf8d00d7.slice/crio-3692f92bdcc2fb6ebe8cf1d19bd0236e6b637eae60317f9066116109d1aff219 WatchSource:0}: Error finding container 3692f92bdcc2fb6ebe8cf1d19bd0236e6b637eae60317f9066116109d1aff219: Status 404 returned error can't find the container with id 3692f92bdcc2fb6ebe8cf1d19bd0236e6b637eae60317f9066116109d1aff219 Nov 26 12:52:20 crc kubenswrapper[4834]: I1126 12:52:20.732093 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8e0a711b-a1c2-474b-87fb-729aaf8d00d7","Type":"ContainerStarted","Data":"3692f92bdcc2fb6ebe8cf1d19bd0236e6b637eae60317f9066116109d1aff219"} Nov 26 12:52:20 crc kubenswrapper[4834]: I1126 12:52:20.745143 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"629fb482-78c1-4e80-96c8-bfd0ec24993d","Type":"ContainerStarted","Data":"3f590e5eef9632c499c5be156b1677eb3fe3f6eda55935cd6d292c2250923598"} Nov 26 12:52:20 crc kubenswrapper[4834]: I1126 12:52:20.773102 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.773082545 podStartE2EDuration="2.773082545s" podCreationTimestamp="2025-11-26 12:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:52:20.764849255 +0000 UTC m=+2438.672062607" watchObservedRunningTime="2025-11-26 12:52:20.773082545 +0000 UTC m=+2438.680295897" Nov 26 12:52:21 crc kubenswrapper[4834]: I1126 12:52:21.021231 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Nov 26 12:52:21 crc kubenswrapper[4834]: I1126 12:52:21.417612 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:52:21 crc kubenswrapper[4834]: E1126 12:52:21.418074 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:52:21 crc kubenswrapper[4834]: I1126 12:52:21.755376 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8e0a711b-a1c2-474b-87fb-729aaf8d00d7","Type":"ContainerStarted","Data":"7b971b0fde143de7eccc182b7b90b25e41c354092708336003f708f510a0950e"} Nov 26 12:52:21 crc kubenswrapper[4834]: I1126 12:52:21.755423 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-scheduler-0" event={"ID":"8e0a711b-a1c2-474b-87fb-729aaf8d00d7","Type":"ContainerStarted","Data":"3c990c92608691ad742661767c538dfc91bc72131811c6096cec8d07c2dac5cc"} Nov 26 12:52:21 crc kubenswrapper[4834]: I1126 12:52:21.776745 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.776727834 podStartE2EDuration="2.776727834s" podCreationTimestamp="2025-11-26 12:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:52:21.768542966 +0000 UTC m=+2439.675756318" watchObservedRunningTime="2025-11-26 12:52:21.776727834 +0000 UTC m=+2439.683941187" Nov 26 12:52:28 crc kubenswrapper[4834]: I1126 12:52:28.348726 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 26 12:52:30 crc kubenswrapper[4834]: I1126 12:52:30.135005 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 26 12:52:33 crc kubenswrapper[4834]: I1126 12:52:33.416731 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:52:33 crc kubenswrapper[4834]: E1126 12:52:33.417395 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:52:39 crc kubenswrapper[4834]: I1126 12:52:39.563244 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Nov 26 12:52:41 crc kubenswrapper[4834]: I1126 
12:52:41.324542 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Nov 26 12:52:44 crc kubenswrapper[4834]: I1126 12:52:44.039707 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 26 12:52:46 crc kubenswrapper[4834]: I1126 12:52:46.422947 4834 scope.go:117] "RemoveContainer" containerID="09cad6b18fe8864a06876896a5d12bb81658c3f8cf28586701f9f600852ae922" Nov 26 12:52:46 crc kubenswrapper[4834]: I1126 12:52:46.445102 4834 scope.go:117] "RemoveContainer" containerID="bb910519d02a4fb561360c59bf461b5e6c9ac9404ecea01044b038c46368c939" Nov 26 12:52:48 crc kubenswrapper[4834]: I1126 12:52:48.417737 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:52:48 crc kubenswrapper[4834]: E1126 12:52:48.418599 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:52:59 crc kubenswrapper[4834]: I1126 12:52:59.418432 4834 scope.go:117] "RemoveContainer" containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:53:00 crc kubenswrapper[4834]: I1126 12:53:00.009057 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"9fd522d7710ec6a6cae7b65384302f26fcdba1cdfb96e53555c07816508c637d"} Nov 26 12:53:32 crc kubenswrapper[4834]: I1126 12:53:32.362390 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z"] Nov 26 12:53:32 crc kubenswrapper[4834]: I1126 12:53:32.363815 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" Nov 26 12:53:32 crc kubenswrapper[4834]: I1126 12:53:32.381211 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z"] Nov 26 12:53:32 crc kubenswrapper[4834]: I1126 12:53:32.548059 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kddc\" (UniqueName: \"kubernetes.io/projected/8267804e-bacd-49ad-a0b2-0168e5d6be37-kube-api-access-8kddc\") pod \"openstack-operator-controller-operator-598cfbdbc-hvk9z\" (UID: \"8267804e-bacd-49ad-a0b2-0168e5d6be37\") " pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" Nov 26 12:53:32 crc kubenswrapper[4834]: I1126 12:53:32.650586 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kddc\" (UniqueName: \"kubernetes.io/projected/8267804e-bacd-49ad-a0b2-0168e5d6be37-kube-api-access-8kddc\") pod \"openstack-operator-controller-operator-598cfbdbc-hvk9z\" (UID: \"8267804e-bacd-49ad-a0b2-0168e5d6be37\") " pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" Nov 26 12:53:32 crc kubenswrapper[4834]: I1126 12:53:32.665955 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kddc\" (UniqueName: \"kubernetes.io/projected/8267804e-bacd-49ad-a0b2-0168e5d6be37-kube-api-access-8kddc\") pod \"openstack-operator-controller-operator-598cfbdbc-hvk9z\" (UID: \"8267804e-bacd-49ad-a0b2-0168e5d6be37\") " pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" Nov 26 12:53:32 crc kubenswrapper[4834]: I1126 12:53:32.685614 4834 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" Nov 26 12:53:33 crc kubenswrapper[4834]: I1126 12:53:33.066132 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z"] Nov 26 12:53:33 crc kubenswrapper[4834]: I1126 12:53:33.223757 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" event={"ID":"8267804e-bacd-49ad-a0b2-0168e5d6be37","Type":"ContainerStarted","Data":"35f2ec7bb0f5e0de884b4678b9192d7b9a7e217bdf24a1c002500086fdc8f8f3"} Nov 26 12:53:33 crc kubenswrapper[4834]: I1126 12:53:33.223799 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" event={"ID":"8267804e-bacd-49ad-a0b2-0168e5d6be37","Type":"ContainerStarted","Data":"a1f6038cc2dfa171bb1aed1f3c17452f44cf6c8ce564fa03206f2eb588b4150a"} Nov 26 12:53:33 crc kubenswrapper[4834]: I1126 12:53:33.223909 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" Nov 26 12:53:33 crc kubenswrapper[4834]: I1126 12:53:33.244605 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" podStartSLOduration=1.244592276 podStartE2EDuration="1.244592276s" podCreationTimestamp="2025-11-26 12:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 12:53:33.241947858 +0000 UTC m=+2511.149161210" watchObservedRunningTime="2025-11-26 12:53:33.244592276 +0000 UTC m=+2511.151805627" Nov 26 12:53:42 crc kubenswrapper[4834]: I1126 12:53:42.688759 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-operator-598cfbdbc-hvk9z" Nov 26 12:53:42 crc kubenswrapper[4834]: I1126 12:53:42.744271 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2"] Nov 26 12:53:42 crc kubenswrapper[4834]: I1126 12:53:42.744478 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" podUID="28ea2d78-52e8-4081-9324-3b3c7acb0c34" containerName="operator" containerID="cri-o://c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5" gracePeriod=10 Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.111858 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.232571 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmt76\" (UniqueName: \"kubernetes.io/projected/28ea2d78-52e8-4081-9324-3b3c7acb0c34-kube-api-access-rmt76\") pod \"28ea2d78-52e8-4081-9324-3b3c7acb0c34\" (UID: \"28ea2d78-52e8-4081-9324-3b3c7acb0c34\") " Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.236997 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ea2d78-52e8-4081-9324-3b3c7acb0c34-kube-api-access-rmt76" (OuterVolumeSpecName: "kube-api-access-rmt76") pod "28ea2d78-52e8-4081-9324-3b3c7acb0c34" (UID: "28ea2d78-52e8-4081-9324-3b3c7acb0c34"). InnerVolumeSpecName "kube-api-access-rmt76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.289912 4834 generic.go:334] "Generic (PLEG): container finished" podID="28ea2d78-52e8-4081-9324-3b3c7acb0c34" containerID="c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5" exitCode=0 Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.289962 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.289971 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" event={"ID":"28ea2d78-52e8-4081-9324-3b3c7acb0c34","Type":"ContainerDied","Data":"c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5"} Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.290009 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2" event={"ID":"28ea2d78-52e8-4081-9324-3b3c7acb0c34","Type":"ContainerDied","Data":"152bd366bfd49654fc0220cb329a454819cdb7f7c3fcc8426bc1a8207a949d3d"} Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.290029 4834 scope.go:117] "RemoveContainer" containerID="c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5" Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.309887 4834 scope.go:117] "RemoveContainer" containerID="c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5" Nov 26 12:53:43 crc kubenswrapper[4834]: E1126 12:53:43.313781 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5\": container with ID starting with c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5 not found: ID does not exist" 
containerID="c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5" Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.313824 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5"} err="failed to get container status \"c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5\": rpc error: code = NotFound desc = could not find container \"c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5\": container with ID starting with c3461ee220afc4da943a6e1fded6b410c16a338e610f630ff41c60570f59e7c5 not found: ID does not exist" Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.316245 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2"] Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.321264 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-544fb75865-94vd2"] Nov 26 12:53:43 crc kubenswrapper[4834]: I1126 12:53:43.335293 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmt76\" (UniqueName: \"kubernetes.io/projected/28ea2d78-52e8-4081-9324-3b3c7acb0c34-kube-api-access-rmt76\") on node \"crc\" DevicePath \"\"" Nov 26 12:53:44 crc kubenswrapper[4834]: I1126 12:53:44.427795 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ea2d78-52e8-4081-9324-3b3c7acb0c34" path="/var/lib/kubelet/pods/28ea2d78-52e8-4081-9324-3b3c7acb0c34/volumes" Nov 26 12:54:18 crc kubenswrapper[4834]: I1126 12:54:18.588331 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d"] Nov 26 12:54:18 crc kubenswrapper[4834]: E1126 12:54:18.589361 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ea2d78-52e8-4081-9324-3b3c7acb0c34" containerName="operator" Nov 
26 12:54:18 crc kubenswrapper[4834]: I1126 12:54:18.589519 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ea2d78-52e8-4081-9324-3b3c7acb0c34" containerName="operator" Nov 26 12:54:18 crc kubenswrapper[4834]: I1126 12:54:18.589737 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ea2d78-52e8-4081-9324-3b3c7acb0c34" containerName="operator" Nov 26 12:54:18 crc kubenswrapper[4834]: I1126 12:54:18.590800 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" Nov 26 12:54:18 crc kubenswrapper[4834]: I1126 12:54:18.599115 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d"] Nov 26 12:54:18 crc kubenswrapper[4834]: I1126 12:54:18.773276 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm75x\" (UniqueName: \"kubernetes.io/projected/8c0416e2-355c-4b19-b275-10d029919025-kube-api-access-wm75x\") pod \"test-operator-controller-manager-6f76cf5bc4-m745d\" (UID: \"8c0416e2-355c-4b19-b275-10d029919025\") " pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" Nov 26 12:54:18 crc kubenswrapper[4834]: I1126 12:54:18.874233 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm75x\" (UniqueName: \"kubernetes.io/projected/8c0416e2-355c-4b19-b275-10d029919025-kube-api-access-wm75x\") pod \"test-operator-controller-manager-6f76cf5bc4-m745d\" (UID: \"8c0416e2-355c-4b19-b275-10d029919025\") " pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" Nov 26 12:54:18 crc kubenswrapper[4834]: I1126 12:54:18.897131 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm75x\" (UniqueName: \"kubernetes.io/projected/8c0416e2-355c-4b19-b275-10d029919025-kube-api-access-wm75x\") pod 
\"test-operator-controller-manager-6f76cf5bc4-m745d\" (UID: \"8c0416e2-355c-4b19-b275-10d029919025\") " pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" Nov 26 12:54:18 crc kubenswrapper[4834]: I1126 12:54:18.908791 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" Nov 26 12:54:19 crc kubenswrapper[4834]: I1126 12:54:19.334998 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d"] Nov 26 12:54:19 crc kubenswrapper[4834]: I1126 12:54:19.341550 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 12:54:19 crc kubenswrapper[4834]: I1126 12:54:19.558351 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" event={"ID":"8c0416e2-355c-4b19-b275-10d029919025","Type":"ContainerStarted","Data":"8d9f2b61b5f4cf5d52d7159b1356575ea2ce24a9c2c6fcb39ed1e9ed7071c90c"} Nov 26 12:55:21 crc kubenswrapper[4834]: I1126 12:55:21.530776 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:55:21 crc kubenswrapper[4834]: I1126 12:55:21.531176 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:55:51 crc kubenswrapper[4834]: I1126 12:55:51.531650 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:55:51 crc kubenswrapper[4834]: I1126 12:55:51.532033 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:56:19 crc kubenswrapper[4834]: E1126 12:56:19.346005 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" image="38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517" Nov 26 12:56:19 crc kubenswrapper[4834]: E1126 12:56:19.346416 4834 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" image="38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517" Nov 26 12:56:19 crc kubenswrapper[4834]: E1126 12:56:19.346571 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wm75x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6f76cf5bc4-m745d_openstack-operators(8c0416e2-355c-4b19-b275-10d029919025): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" logger="UnhandledError" Nov 26 12:56:19 crc kubenswrapper[4834]: E1126 12:56:19.489915 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \\\"http://38.102.83.98:5001/v2/\\\": dial tcp 38.102.83.98:5001: i/o timeout\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025" Nov 26 12:56:20 crc kubenswrapper[4834]: I1126 12:56:20.414849 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" event={"ID":"8c0416e2-355c-4b19-b275-10d029919025","Type":"ContainerStarted","Data":"90c7e88e63f7edcf2c2231b7d4af03a1439fe27c3072d1a64ff25a9adc029ca0"} Nov 26 12:56:20 crc kubenswrapper[4834]: E1126 12:56:20.416833 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025" Nov 26 12:56:21 crc kubenswrapper[4834]: E1126 12:56:21.424458 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025" Nov 26 12:56:21 crc kubenswrapper[4834]: I1126 12:56:21.531034 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:56:21 crc kubenswrapper[4834]: I1126 12:56:21.531433 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:56:21 crc kubenswrapper[4834]: I1126 12:56:21.531476 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:56:21 crc kubenswrapper[4834]: I1126 12:56:21.532290 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fd522d7710ec6a6cae7b65384302f26fcdba1cdfb96e53555c07816508c637d"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:56:21 crc kubenswrapper[4834]: I1126 12:56:21.532371 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://9fd522d7710ec6a6cae7b65384302f26fcdba1cdfb96e53555c07816508c637d" gracePeriod=600 Nov 26 12:56:22 crc kubenswrapper[4834]: I1126 12:56:22.429716 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="9fd522d7710ec6a6cae7b65384302f26fcdba1cdfb96e53555c07816508c637d" exitCode=0 Nov 26 12:56:22 crc kubenswrapper[4834]: I1126 12:56:22.429777 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"9fd522d7710ec6a6cae7b65384302f26fcdba1cdfb96e53555c07816508c637d"} Nov 26 12:56:22 crc kubenswrapper[4834]: I1126 12:56:22.430018 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"} Nov 26 12:56:22 crc kubenswrapper[4834]: I1126 12:56:22.430048 4834 scope.go:117] "RemoveContainer" 
containerID="1011806e33f31f233343e2f0285bd695b1ec9abee2a866defd0d700a72d96c88" Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.783964 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q54tt/must-gather-njj5w"] Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.785779 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q54tt/must-gather-njj5w" Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.790465 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-q54tt"/"openshift-service-ca.crt" Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.791154 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-q54tt"/"kube-root-ca.crt" Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.796593 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-q54tt"/"default-dockercfg-hrjmj" Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.800941 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-q54tt/must-gather-njj5w"] Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.849920 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a4c19b6-76ea-4977-bac4-7f1406bee595-must-gather-output\") pod \"must-gather-njj5w\" (UID: \"1a4c19b6-76ea-4977-bac4-7f1406bee595\") " pod="openshift-must-gather-q54tt/must-gather-njj5w" Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.850044 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcm4l\" (UniqueName: \"kubernetes.io/projected/1a4c19b6-76ea-4977-bac4-7f1406bee595-kube-api-access-vcm4l\") pod \"must-gather-njj5w\" (UID: \"1a4c19b6-76ea-4977-bac4-7f1406bee595\") " pod="openshift-must-gather-q54tt/must-gather-njj5w" 
Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.952005 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a4c19b6-76ea-4977-bac4-7f1406bee595-must-gather-output\") pod \"must-gather-njj5w\" (UID: \"1a4c19b6-76ea-4977-bac4-7f1406bee595\") " pod="openshift-must-gather-q54tt/must-gather-njj5w" Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.952122 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm4l\" (UniqueName: \"kubernetes.io/projected/1a4c19b6-76ea-4977-bac4-7f1406bee595-kube-api-access-vcm4l\") pod \"must-gather-njj5w\" (UID: \"1a4c19b6-76ea-4977-bac4-7f1406bee595\") " pod="openshift-must-gather-q54tt/must-gather-njj5w" Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.952756 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a4c19b6-76ea-4977-bac4-7f1406bee595-must-gather-output\") pod \"must-gather-njj5w\" (UID: \"1a4c19b6-76ea-4977-bac4-7f1406bee595\") " pod="openshift-must-gather-q54tt/must-gather-njj5w" Nov 26 12:57:41 crc kubenswrapper[4834]: I1126 12:57:41.967457 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcm4l\" (UniqueName: \"kubernetes.io/projected/1a4c19b6-76ea-4977-bac4-7f1406bee595-kube-api-access-vcm4l\") pod \"must-gather-njj5w\" (UID: \"1a4c19b6-76ea-4977-bac4-7f1406bee595\") " pod="openshift-must-gather-q54tt/must-gather-njj5w" Nov 26 12:57:42 crc kubenswrapper[4834]: I1126 12:57:42.109719 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q54tt/must-gather-njj5w" Nov 26 12:57:42 crc kubenswrapper[4834]: I1126 12:57:42.496921 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-q54tt/must-gather-njj5w"] Nov 26 12:57:42 crc kubenswrapper[4834]: I1126 12:57:42.957495 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q54tt/must-gather-njj5w" event={"ID":"1a4c19b6-76ea-4977-bac4-7f1406bee595","Type":"ContainerStarted","Data":"390986913e77870484b167a0733ba15a4cef54e70f37a6db9500fc4c313d4f07"} Nov 26 12:57:45 crc kubenswrapper[4834]: I1126 12:57:45.980256 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q54tt/must-gather-njj5w" event={"ID":"1a4c19b6-76ea-4977-bac4-7f1406bee595","Type":"ContainerStarted","Data":"df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559"} Nov 26 12:57:46 crc kubenswrapper[4834]: I1126 12:57:46.988919 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q54tt/must-gather-njj5w" event={"ID":"1a4c19b6-76ea-4977-bac4-7f1406bee595","Type":"ContainerStarted","Data":"16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab"} Nov 26 12:57:47 crc kubenswrapper[4834]: I1126 12:57:47.005975 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-q54tt/must-gather-njj5w" podStartSLOduration=2.795652612 podStartE2EDuration="6.005961149s" podCreationTimestamp="2025-11-26 12:57:41 +0000 UTC" firstStartedPulling="2025-11-26 12:57:42.502128961 +0000 UTC m=+2760.409342314" lastFinishedPulling="2025-11-26 12:57:45.712437499 +0000 UTC m=+2763.619650851" observedRunningTime="2025-11-26 12:57:47.001358128 +0000 UTC m=+2764.908571480" watchObservedRunningTime="2025-11-26 12:57:47.005961149 +0000 UTC m=+2764.913174501" Nov 26 12:57:49 crc kubenswrapper[4834]: I1126 12:57:49.167250 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-q54tt/crc-debug-v2v6t"] Nov 26 12:57:49 crc kubenswrapper[4834]: I1126 12:57:49.168960 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:57:49 crc kubenswrapper[4834]: I1126 12:57:49.200056 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a407638-68bd-46a2-9229-0c8e5449696d-host\") pod \"crc-debug-v2v6t\" (UID: \"7a407638-68bd-46a2-9229-0c8e5449696d\") " pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:57:49 crc kubenswrapper[4834]: I1126 12:57:49.200125 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbnt9\" (UniqueName: \"kubernetes.io/projected/7a407638-68bd-46a2-9229-0c8e5449696d-kube-api-access-sbnt9\") pod \"crc-debug-v2v6t\" (UID: \"7a407638-68bd-46a2-9229-0c8e5449696d\") " pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:57:49 crc kubenswrapper[4834]: I1126 12:57:49.301757 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a407638-68bd-46a2-9229-0c8e5449696d-host\") pod \"crc-debug-v2v6t\" (UID: \"7a407638-68bd-46a2-9229-0c8e5449696d\") " pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:57:49 crc kubenswrapper[4834]: I1126 12:57:49.302034 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbnt9\" (UniqueName: \"kubernetes.io/projected/7a407638-68bd-46a2-9229-0c8e5449696d-kube-api-access-sbnt9\") pod \"crc-debug-v2v6t\" (UID: \"7a407638-68bd-46a2-9229-0c8e5449696d\") " pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:57:49 crc kubenswrapper[4834]: I1126 12:57:49.301951 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/7a407638-68bd-46a2-9229-0c8e5449696d-host\") pod \"crc-debug-v2v6t\" (UID: \"7a407638-68bd-46a2-9229-0c8e5449696d\") " pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:57:49 crc kubenswrapper[4834]: I1126 12:57:49.323073 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbnt9\" (UniqueName: \"kubernetes.io/projected/7a407638-68bd-46a2-9229-0c8e5449696d-kube-api-access-sbnt9\") pod \"crc-debug-v2v6t\" (UID: \"7a407638-68bd-46a2-9229-0c8e5449696d\") " pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:57:49 crc kubenswrapper[4834]: I1126 12:57:49.484717 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:57:49 crc kubenswrapper[4834]: W1126 12:57:49.514509 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a407638_68bd_46a2_9229_0c8e5449696d.slice/crio-27387b0dfd347d3c0129d3831abf842a84b8aaa586f6b58193d7952b72331f0f WatchSource:0}: Error finding container 27387b0dfd347d3c0129d3831abf842a84b8aaa586f6b58193d7952b72331f0f: Status 404 returned error can't find the container with id 27387b0dfd347d3c0129d3831abf842a84b8aaa586f6b58193d7952b72331f0f Nov 26 12:57:50 crc kubenswrapper[4834]: I1126 12:57:50.010191 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q54tt/crc-debug-v2v6t" event={"ID":"7a407638-68bd-46a2-9229-0c8e5449696d","Type":"ContainerStarted","Data":"27387b0dfd347d3c0129d3831abf842a84b8aaa586f6b58193d7952b72331f0f"} Nov 26 12:58:00 crc kubenswrapper[4834]: I1126 12:58:00.089242 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q54tt/crc-debug-v2v6t" event={"ID":"7a407638-68bd-46a2-9229-0c8e5449696d","Type":"ContainerStarted","Data":"9800adcf768d6e465d67a52aad561531f1054d3958adb86aeeeb934a1e20022a"} Nov 26 12:58:00 crc kubenswrapper[4834]: I1126 
12:58:00.108765 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-q54tt/crc-debug-v2v6t" podStartSLOduration=1.427210379 podStartE2EDuration="11.108746204s" podCreationTimestamp="2025-11-26 12:57:49 +0000 UTC" firstStartedPulling="2025-11-26 12:57:49.516642903 +0000 UTC m=+2767.423856256" lastFinishedPulling="2025-11-26 12:57:59.198178729 +0000 UTC m=+2777.105392081" observedRunningTime="2025-11-26 12:58:00.106281677 +0000 UTC m=+2778.013495029" watchObservedRunningTime="2025-11-26 12:58:00.108746204 +0000 UTC m=+2778.015959557" Nov 26 12:58:17 crc kubenswrapper[4834]: I1126 12:58:17.201052 4834 generic.go:334] "Generic (PLEG): container finished" podID="7a407638-68bd-46a2-9229-0c8e5449696d" containerID="9800adcf768d6e465d67a52aad561531f1054d3958adb86aeeeb934a1e20022a" exitCode=0 Nov 26 12:58:17 crc kubenswrapper[4834]: I1126 12:58:17.201123 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q54tt/crc-debug-v2v6t" event={"ID":"7a407638-68bd-46a2-9229-0c8e5449696d","Type":"ContainerDied","Data":"9800adcf768d6e465d67a52aad561531f1054d3958adb86aeeeb934a1e20022a"} Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.098827 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-477sx"] Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.100670 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.111484 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-477sx"] Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.162905 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-catalog-content\") pod \"redhat-marketplace-477sx\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.162941 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-utilities\") pod \"redhat-marketplace-477sx\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.163068 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdmbm\" (UniqueName: \"kubernetes.io/projected/c729be52-6a4e-4720-96eb-6d5421016044-kube-api-access-sdmbm\") pod \"redhat-marketplace-477sx\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.264874 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdmbm\" (UniqueName: \"kubernetes.io/projected/c729be52-6a4e-4720-96eb-6d5421016044-kube-api-access-sdmbm\") pod \"redhat-marketplace-477sx\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.265067 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-catalog-content\") pod \"redhat-marketplace-477sx\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.265092 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-utilities\") pod \"redhat-marketplace-477sx\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.265651 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-utilities\") pod \"redhat-marketplace-477sx\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.265708 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-catalog-content\") pod \"redhat-marketplace-477sx\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.286505 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdmbm\" (UniqueName: \"kubernetes.io/projected/c729be52-6a4e-4720-96eb-6d5421016044-kube-api-access-sdmbm\") pod \"redhat-marketplace-477sx\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.323261 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.348929 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q54tt/crc-debug-v2v6t"] Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.356457 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q54tt/crc-debug-v2v6t"] Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.366677 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbnt9\" (UniqueName: \"kubernetes.io/projected/7a407638-68bd-46a2-9229-0c8e5449696d-kube-api-access-sbnt9\") pod \"7a407638-68bd-46a2-9229-0c8e5449696d\" (UID: \"7a407638-68bd-46a2-9229-0c8e5449696d\") " Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.366759 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a407638-68bd-46a2-9229-0c8e5449696d-host\") pod \"7a407638-68bd-46a2-9229-0c8e5449696d\" (UID: \"7a407638-68bd-46a2-9229-0c8e5449696d\") " Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.366861 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a407638-68bd-46a2-9229-0c8e5449696d-host" (OuterVolumeSpecName: "host") pod "7a407638-68bd-46a2-9229-0c8e5449696d" (UID: "7a407638-68bd-46a2-9229-0c8e5449696d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.367498 4834 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a407638-68bd-46a2-9229-0c8e5449696d-host\") on node \"crc\" DevicePath \"\"" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.370429 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a407638-68bd-46a2-9229-0c8e5449696d-kube-api-access-sbnt9" (OuterVolumeSpecName: "kube-api-access-sbnt9") pod "7a407638-68bd-46a2-9229-0c8e5449696d" (UID: "7a407638-68bd-46a2-9229-0c8e5449696d"). InnerVolumeSpecName "kube-api-access-sbnt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.416515 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.424228 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a407638-68bd-46a2-9229-0c8e5449696d" path="/var/lib/kubelet/pods/7a407638-68bd-46a2-9229-0c8e5449696d/volumes" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.469150 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbnt9\" (UniqueName: \"kubernetes.io/projected/7a407638-68bd-46a2-9229-0c8e5449696d-kube-api-access-sbnt9\") on node \"crc\" DevicePath \"\"" Nov 26 12:58:18 crc kubenswrapper[4834]: I1126 12:58:18.802071 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-477sx"] Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.223703 4834 scope.go:117] "RemoveContainer" containerID="9800adcf768d6e465d67a52aad561531f1054d3958adb86aeeeb934a1e20022a" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.224220 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q54tt/crc-debug-v2v6t" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.228889 4834 generic.go:334] "Generic (PLEG): container finished" podID="c729be52-6a4e-4720-96eb-6d5421016044" containerID="b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473" exitCode=0 Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.228949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477sx" event={"ID":"c729be52-6a4e-4720-96eb-6d5421016044","Type":"ContainerDied","Data":"b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473"} Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.228988 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477sx" event={"ID":"c729be52-6a4e-4720-96eb-6d5421016044","Type":"ContainerStarted","Data":"30f77f3ae730d5d1c809473e5a51da2e3b085b59cb02dba887fbf8829f43b3ee"} Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.512385 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q54tt/crc-debug-wrrxn"] Nov 26 12:58:19 crc kubenswrapper[4834]: E1126 12:58:19.512733 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a407638-68bd-46a2-9229-0c8e5449696d" containerName="container-00" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.512746 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a407638-68bd-46a2-9229-0c8e5449696d" containerName="container-00" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.512921 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a407638-68bd-46a2-9229-0c8e5449696d" containerName="container-00" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.513522 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.585574 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6jkc\" (UniqueName: \"kubernetes.io/projected/f2007c3e-7e78-42c9-925a-b80c07223ebb-kube-api-access-k6jkc\") pod \"crc-debug-wrrxn\" (UID: \"f2007c3e-7e78-42c9-925a-b80c07223ebb\") " pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.585613 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2007c3e-7e78-42c9-925a-b80c07223ebb-host\") pod \"crc-debug-wrrxn\" (UID: \"f2007c3e-7e78-42c9-925a-b80c07223ebb\") " pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.687129 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6jkc\" (UniqueName: \"kubernetes.io/projected/f2007c3e-7e78-42c9-925a-b80c07223ebb-kube-api-access-k6jkc\") pod \"crc-debug-wrrxn\" (UID: \"f2007c3e-7e78-42c9-925a-b80c07223ebb\") " pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.687173 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2007c3e-7e78-42c9-925a-b80c07223ebb-host\") pod \"crc-debug-wrrxn\" (UID: \"f2007c3e-7e78-42c9-925a-b80c07223ebb\") " pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.687371 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2007c3e-7e78-42c9-925a-b80c07223ebb-host\") pod \"crc-debug-wrrxn\" (UID: \"f2007c3e-7e78-42c9-925a-b80c07223ebb\") " pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:19 crc 
kubenswrapper[4834]: I1126 12:58:19.705112 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6jkc\" (UniqueName: \"kubernetes.io/projected/f2007c3e-7e78-42c9-925a-b80c07223ebb-kube-api-access-k6jkc\") pod \"crc-debug-wrrxn\" (UID: \"f2007c3e-7e78-42c9-925a-b80c07223ebb\") " pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:19 crc kubenswrapper[4834]: I1126 12:58:19.825804 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:20 crc kubenswrapper[4834]: I1126 12:58:20.328575 4834 generic.go:334] "Generic (PLEG): container finished" podID="f2007c3e-7e78-42c9-925a-b80c07223ebb" containerID="324e15a6ee7a0544fdaff8a54580a1cd02382a7b673ff457bc21e7bbdf145ffd" exitCode=1 Nov 26 12:58:20 crc kubenswrapper[4834]: I1126 12:58:20.328909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q54tt/crc-debug-wrrxn" event={"ID":"f2007c3e-7e78-42c9-925a-b80c07223ebb","Type":"ContainerDied","Data":"324e15a6ee7a0544fdaff8a54580a1cd02382a7b673ff457bc21e7bbdf145ffd"} Nov 26 12:58:20 crc kubenswrapper[4834]: I1126 12:58:20.328942 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q54tt/crc-debug-wrrxn" event={"ID":"f2007c3e-7e78-42c9-925a-b80c07223ebb","Type":"ContainerStarted","Data":"308edb5c1a953c1a5f4ee7c6fafede002b87669bd66556d7886d7f1013707d6f"} Nov 26 12:58:20 crc kubenswrapper[4834]: I1126 12:58:20.371620 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q54tt/crc-debug-wrrxn"] Nov 26 12:58:20 crc kubenswrapper[4834]: I1126 12:58:20.379436 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q54tt/crc-debug-wrrxn"] Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.337572 4834 generic.go:334] "Generic (PLEG): container finished" podID="c729be52-6a4e-4720-96eb-6d5421016044" 
containerID="a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965" exitCode=0 Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.338030 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477sx" event={"ID":"c729be52-6a4e-4720-96eb-6d5421016044","Type":"ContainerDied","Data":"a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965"} Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.415034 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.440112 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2007c3e-7e78-42c9-925a-b80c07223ebb-host\") pod \"f2007c3e-7e78-42c9-925a-b80c07223ebb\" (UID: \"f2007c3e-7e78-42c9-925a-b80c07223ebb\") " Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.440219 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2007c3e-7e78-42c9-925a-b80c07223ebb-host" (OuterVolumeSpecName: "host") pod "f2007c3e-7e78-42c9-925a-b80c07223ebb" (UID: "f2007c3e-7e78-42c9-925a-b80c07223ebb"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.440335 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6jkc\" (UniqueName: \"kubernetes.io/projected/f2007c3e-7e78-42c9-925a-b80c07223ebb-kube-api-access-k6jkc\") pod \"f2007c3e-7e78-42c9-925a-b80c07223ebb\" (UID: \"f2007c3e-7e78-42c9-925a-b80c07223ebb\") " Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.441125 4834 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2007c3e-7e78-42c9-925a-b80c07223ebb-host\") on node \"crc\" DevicePath \"\"" Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.446451 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2007c3e-7e78-42c9-925a-b80c07223ebb-kube-api-access-k6jkc" (OuterVolumeSpecName: "kube-api-access-k6jkc") pod "f2007c3e-7e78-42c9-925a-b80c07223ebb" (UID: "f2007c3e-7e78-42c9-925a-b80c07223ebb"). InnerVolumeSpecName "kube-api-access-k6jkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.531380 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.531445 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:58:21 crc kubenswrapper[4834]: I1126 12:58:21.543126 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6jkc\" (UniqueName: \"kubernetes.io/projected/f2007c3e-7e78-42c9-925a-b80c07223ebb-kube-api-access-k6jkc\") on node \"crc\" DevicePath \"\"" Nov 26 12:58:22 crc kubenswrapper[4834]: I1126 12:58:22.345879 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477sx" event={"ID":"c729be52-6a4e-4720-96eb-6d5421016044","Type":"ContainerStarted","Data":"0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f"} Nov 26 12:58:22 crc kubenswrapper[4834]: I1126 12:58:22.351072 4834 scope.go:117] "RemoveContainer" containerID="324e15a6ee7a0544fdaff8a54580a1cd02382a7b673ff457bc21e7bbdf145ffd" Nov 26 12:58:22 crc kubenswrapper[4834]: I1126 12:58:22.351132 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q54tt/crc-debug-wrrxn" Nov 26 12:58:22 crc kubenswrapper[4834]: I1126 12:58:22.366406 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-477sx" podStartSLOduration=1.674387794 podStartE2EDuration="4.36639021s" podCreationTimestamp="2025-11-26 12:58:18 +0000 UTC" firstStartedPulling="2025-11-26 12:58:19.231992312 +0000 UTC m=+2797.139205664" lastFinishedPulling="2025-11-26 12:58:21.923994727 +0000 UTC m=+2799.831208080" observedRunningTime="2025-11-26 12:58:22.361010965 +0000 UTC m=+2800.268224317" watchObservedRunningTime="2025-11-26 12:58:22.36639021 +0000 UTC m=+2800.273603561" Nov 26 12:58:22 crc kubenswrapper[4834]: I1126 12:58:22.425260 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2007c3e-7e78-42c9-925a-b80c07223ebb" path="/var/lib/kubelet/pods/f2007c3e-7e78-42c9-925a-b80c07223ebb/volumes" Nov 26 12:58:28 crc kubenswrapper[4834]: I1126 12:58:28.428165 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:28 crc kubenswrapper[4834]: I1126 12:58:28.428805 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:28 crc kubenswrapper[4834]: I1126 12:58:28.459652 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:29 crc kubenswrapper[4834]: I1126 12:58:29.427793 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:29 crc kubenswrapper[4834]: I1126 12:58:29.464072 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-477sx"] Nov 26 12:58:31 crc kubenswrapper[4834]: I1126 12:58:31.407944 4834 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-477sx" podUID="c729be52-6a4e-4720-96eb-6d5421016044" containerName="registry-server" containerID="cri-o://0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f" gracePeriod=2 Nov 26 12:58:31 crc kubenswrapper[4834]: I1126 12:58:31.778012 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:31 crc kubenswrapper[4834]: I1126 12:58:31.915935 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-utilities\") pod \"c729be52-6a4e-4720-96eb-6d5421016044\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " Nov 26 12:58:31 crc kubenswrapper[4834]: I1126 12:58:31.915972 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdmbm\" (UniqueName: \"kubernetes.io/projected/c729be52-6a4e-4720-96eb-6d5421016044-kube-api-access-sdmbm\") pod \"c729be52-6a4e-4720-96eb-6d5421016044\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " Nov 26 12:58:31 crc kubenswrapper[4834]: I1126 12:58:31.916184 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-catalog-content\") pod \"c729be52-6a4e-4720-96eb-6d5421016044\" (UID: \"c729be52-6a4e-4720-96eb-6d5421016044\") " Nov 26 12:58:31 crc kubenswrapper[4834]: I1126 12:58:31.916801 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-utilities" (OuterVolumeSpecName: "utilities") pod "c729be52-6a4e-4720-96eb-6d5421016044" (UID: "c729be52-6a4e-4720-96eb-6d5421016044"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:58:31 crc kubenswrapper[4834]: I1126 12:58:31.920434 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c729be52-6a4e-4720-96eb-6d5421016044-kube-api-access-sdmbm" (OuterVolumeSpecName: "kube-api-access-sdmbm") pod "c729be52-6a4e-4720-96eb-6d5421016044" (UID: "c729be52-6a4e-4720-96eb-6d5421016044"). InnerVolumeSpecName "kube-api-access-sdmbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:58:31 crc kubenswrapper[4834]: I1126 12:58:31.928196 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c729be52-6a4e-4720-96eb-6d5421016044" (UID: "c729be52-6a4e-4720-96eb-6d5421016044"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.018128 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.018158 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdmbm\" (UniqueName: \"kubernetes.io/projected/c729be52-6a4e-4720-96eb-6d5421016044-kube-api-access-sdmbm\") on node \"crc\" DevicePath \"\"" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.018168 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c729be52-6a4e-4720-96eb-6d5421016044-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.417232 4834 generic.go:334] "Generic (PLEG): container finished" podID="c729be52-6a4e-4720-96eb-6d5421016044" 
containerID="0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f" exitCode=0 Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.421342 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-477sx" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.424342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477sx" event={"ID":"c729be52-6a4e-4720-96eb-6d5421016044","Type":"ContainerDied","Data":"0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f"} Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.424383 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477sx" event={"ID":"c729be52-6a4e-4720-96eb-6d5421016044","Type":"ContainerDied","Data":"30f77f3ae730d5d1c809473e5a51da2e3b085b59cb02dba887fbf8829f43b3ee"} Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.424404 4834 scope.go:117] "RemoveContainer" containerID="0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.438965 4834 scope.go:117] "RemoveContainer" containerID="a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.453851 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-477sx"] Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.460224 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-477sx"] Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.467808 4834 scope.go:117] "RemoveContainer" containerID="b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.485643 4834 scope.go:117] "RemoveContainer" containerID="0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f" Nov 26 
12:58:32 crc kubenswrapper[4834]: E1126 12:58:32.485900 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f\": container with ID starting with 0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f not found: ID does not exist" containerID="0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.485935 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f"} err="failed to get container status \"0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f\": rpc error: code = NotFound desc = could not find container \"0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f\": container with ID starting with 0bb0deaac86eee4d7b8de8dc4d19adda0273ce49ed3783386b5781d35dc6189f not found: ID does not exist" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.485957 4834 scope.go:117] "RemoveContainer" containerID="a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965" Nov 26 12:58:32 crc kubenswrapper[4834]: E1126 12:58:32.486429 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965\": container with ID starting with a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965 not found: ID does not exist" containerID="a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.486463 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965"} err="failed to get container status 
\"a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965\": rpc error: code = NotFound desc = could not find container \"a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965\": container with ID starting with a2525425a49976df58d863b6739f6876d64c4699ae4337682c9cbe927e73c965 not found: ID does not exist" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.486489 4834 scope.go:117] "RemoveContainer" containerID="b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473" Nov 26 12:58:32 crc kubenswrapper[4834]: E1126 12:58:32.487655 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473\": container with ID starting with b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473 not found: ID does not exist" containerID="b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473" Nov 26 12:58:32 crc kubenswrapper[4834]: I1126 12:58:32.487703 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473"} err="failed to get container status \"b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473\": rpc error: code = NotFound desc = could not find container \"b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473\": container with ID starting with b1daf129a6a247774c28b3519363046ee14049f1fcd517e974876e0cea72a473 not found: ID does not exist" Nov 26 12:58:33 crc kubenswrapper[4834]: E1126 12:58:33.423938 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" 
image="38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517" Nov 26 12:58:33 crc kubenswrapper[4834]: E1126 12:58:33.424140 4834 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" image="38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517" Nov 26 12:58:33 crc kubenswrapper[4834]: E1126 12:58:33.424259 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wm75x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6f76cf5bc4-m745d_openstack-operators(8c0416e2-355c-4b19-b275-10d029919025): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" logger="UnhandledError" Nov 26 12:58:33 crc kubenswrapper[4834]: E1126 12:58:33.426127 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get 
\\\"http://38.102.83.98:5001/v2/\\\": dial tcp 38.102.83.98:5001: i/o timeout\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025" Nov 26 12:58:34 crc kubenswrapper[4834]: I1126 12:58:34.425546 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c729be52-6a4e-4720-96eb-6d5421016044" path="/var/lib/kubelet/pods/c729be52-6a4e-4720-96eb-6d5421016044/volumes" Nov 26 12:58:45 crc kubenswrapper[4834]: E1126 12:58:45.418556 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025" Nov 26 12:58:51 crc kubenswrapper[4834]: I1126 12:58:51.531295 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:58:51 crc kubenswrapper[4834]: I1126 12:58:51.531920 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:59:05 crc kubenswrapper[4834]: I1126 12:59:05.420930 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78b8cf7fb4-c2jnd_92800ecd-26fe-48c0-92fc-46d652fe0480/barbican-api/0.log" Nov 26 12:59:05 crc kubenswrapper[4834]: I1126 12:59:05.556234 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-78b8cf7fb4-c2jnd_92800ecd-26fe-48c0-92fc-46d652fe0480/barbican-api-log/0.log" Nov 26 12:59:05 crc kubenswrapper[4834]: I1126 12:59:05.571496 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7fc459bb58-nxgpv_6dd7a932-530b-46cd-bfd6-d26679708721/barbican-keystone-listener/0.log" Nov 26 12:59:05 crc kubenswrapper[4834]: I1126 12:59:05.642258 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7fc459bb58-nxgpv_6dd7a932-530b-46cd-bfd6-d26679708721/barbican-keystone-listener-log/0.log" Nov 26 12:59:05 crc kubenswrapper[4834]: I1126 12:59:05.728026 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7459d949b5-hx42r_1d471582-dd4e-4b07-b99c-16196de70224/barbican-worker/0.log" Nov 26 12:59:05 crc kubenswrapper[4834]: I1126 12:59:05.768893 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7459d949b5-hx42r_1d471582-dd4e-4b07-b99c-16196de70224/barbican-worker-log/0.log" Nov 26 12:59:05 crc kubenswrapper[4834]: I1126 12:59:05.921110 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-z6466_d7c68a58-7124-4e32-b9d5-a2d0154c63dd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:05 crc kubenswrapper[4834]: I1126 12:59:05.930326 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7ca87460-b55b-4744-af3d-37a0a9662784/ceilometer-central-agent/0.log" Nov 26 12:59:05 crc kubenswrapper[4834]: I1126 12:59:05.988936 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7ca87460-b55b-4744-af3d-37a0a9662784/ceilometer-notification-agent/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.079724 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_7ca87460-b55b-4744-af3d-37a0a9662784/proxy-httpd/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.086026 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7ca87460-b55b-4744-af3d-37a0a9662784/sg-core/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.142906 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-j49br_f3175863-4ff8-4c14-82c8-2a1ae6bccf38/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.231300 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h86nd_85f0a2af-a310-4854-adec-f40a66bb0ba3/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.333846 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5845ee5-aed4-4450-a04c-2e25bd2dc0f2/cinder-api/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.400333 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5845ee5-aed4-4450-a04c-2e25bd2dc0f2/cinder-api-log/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.519809 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e978c710-b8cc-4608-a8b7-32619386447c/probe/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.535885 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e978c710-b8cc-4608-a8b7-32619386447c/cinder-backup/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.644889 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_60fea7de-82b0-4107-b0cb-07f09bbd2341/cinder-scheduler/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.692476 4834 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_60fea7de-82b0-4107-b0cb-07f09bbd2341/probe/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.821740 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_475f5c3e-098e-4a99-83da-c2513b5d0ed7/cinder-volume/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.823433 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_475f5c3e-098e-4a99-83da-c2513b5d0ed7/probe/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.887025 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vsq5r_0c3d3815-4dbf-4add-a702-d215553b2bbb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:06 crc kubenswrapper[4834]: I1126 12:59:06.982830 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cnwqb_c9dc53e0-b81a-427b-9459-9cef85f9be65/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.074355 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78f48d6b7c-vglzt_94a582ef-1398-4fa2-afa2-2627ebc94e06/init/0.log" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.140092 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c5mvp"] Nov 26 12:59:07 crc kubenswrapper[4834]: E1126 12:59:07.140424 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2007c3e-7e78-42c9-925a-b80c07223ebb" containerName="container-00" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.140435 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2007c3e-7e78-42c9-925a-b80c07223ebb" containerName="container-00" Nov 26 12:59:07 crc kubenswrapper[4834]: E1126 12:59:07.140460 4834 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c729be52-6a4e-4720-96eb-6d5421016044" containerName="registry-server" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.140465 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c729be52-6a4e-4720-96eb-6d5421016044" containerName="registry-server" Nov 26 12:59:07 crc kubenswrapper[4834]: E1126 12:59:07.140482 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c729be52-6a4e-4720-96eb-6d5421016044" containerName="extract-content" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.140488 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c729be52-6a4e-4720-96eb-6d5421016044" containerName="extract-content" Nov 26 12:59:07 crc kubenswrapper[4834]: E1126 12:59:07.140497 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c729be52-6a4e-4720-96eb-6d5421016044" containerName="extract-utilities" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.140502 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c729be52-6a4e-4720-96eb-6d5421016044" containerName="extract-utilities" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.140739 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2007c3e-7e78-42c9-925a-b80c07223ebb" containerName="container-00" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.140760 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c729be52-6a4e-4720-96eb-6d5421016044" containerName="registry-server" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.141944 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.152139 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c5mvp"] Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.196594 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-utilities\") pod \"community-operators-c5mvp\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.196643 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg79x\" (UniqueName: \"kubernetes.io/projected/736927ed-2a7f-474b-b9fb-893a009226af-kube-api-access-jg79x\") pod \"community-operators-c5mvp\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.197373 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-catalog-content\") pod \"community-operators-c5mvp\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.234473 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fdfe0239-1020-40b2-9031-02cd04267ad0/glance-httpd/0.log" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.236720 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78f48d6b7c-vglzt_94a582ef-1398-4fa2-afa2-2627ebc94e06/init/0.log" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 
12:59:07.300029 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-catalog-content\") pod \"community-operators-c5mvp\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.300108 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-utilities\") pod \"community-operators-c5mvp\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.300154 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg79x\" (UniqueName: \"kubernetes.io/projected/736927ed-2a7f-474b-b9fb-893a009226af-kube-api-access-jg79x\") pod \"community-operators-c5mvp\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.300594 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-catalog-content\") pod \"community-operators-c5mvp\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.300647 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-utilities\") pod \"community-operators-c5mvp\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.317562 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg79x\" (UniqueName: \"kubernetes.io/projected/736927ed-2a7f-474b-b9fb-893a009226af-kube-api-access-jg79x\") pod \"community-operators-c5mvp\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.322398 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78f48d6b7c-vglzt_94a582ef-1398-4fa2-afa2-2627ebc94e06/dnsmasq-dns/0.log" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.457386 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.463356 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fdfe0239-1020-40b2-9031-02cd04267ad0/glance-log/0.log" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.560137 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_659dc5b8-db0c-47cd-9c94-ba96b4256129/glance-log/0.log" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.654774 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_659dc5b8-db0c-47cd-9c94-ba96b4256129/glance-httpd/0.log" Nov 26 12:59:07 crc kubenswrapper[4834]: I1126 12:59:07.992928 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c5mvp"] Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.040053 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5895978f64-t9cvb_9347056f-b194-406f-8b28-bd86ee220403/horizon/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.065898 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-5895978f64-t9cvb_9347056f-b194-406f-8b28-bd86ee220403/horizon-log/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.145945 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bkgzr_797b63c9-b38e-407e-b2a2-8999021770ce/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.218016 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-575pl_10689f77-1926-4fc1-be46-ab8c79eb3d11/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.406262 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-595cbdb8c4-fwh2n_be2722bc-d66d-49d8-8966-769edf761453/keystone-api/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.408909 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4be94615-ce61-41ff-a0e1-5fe4851d42ea/kube-state-metrics/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.569675 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-77jlt_db179b40-80be-4e46-9f35-677180198b4e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.633211 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-246d-account-create-update-9jk9p_915f4671-4ace-4684-b095-75ee89fc9c7b/mariadb-account-create-update/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.689181 4834 generic.go:334] "Generic (PLEG): container finished" podID="736927ed-2a7f-474b-b9fb-893a009226af" containerID="a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599" exitCode=0 Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.689219 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-c5mvp" event={"ID":"736927ed-2a7f-474b-b9fb-893a009226af","Type":"ContainerDied","Data":"a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599"} Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.689243 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5mvp" event={"ID":"736927ed-2a7f-474b-b9fb-893a009226af","Type":"ContainerStarted","Data":"da113dc487c8f99597e541aa88a2d1abcb13079123bfc175627178dfdcf1535c"} Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.769801 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_9dcf49f7-938b-417a-91c0-52cbd58f8c62/manila-api-log/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.817565 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_9dcf49f7-938b-417a-91c0-52cbd58f8c62/manila-api/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.829759 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-create-b2qmr_69f2a300-39e4-4bfc-bb9a-5646fe44709c/mariadb-database-create/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.943344 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-sync-4bj4f_b67b910b-d864-4d0f-9f34-921e1cdd0517/manila-db-sync/0.log" Nov 26 12:59:08 crc kubenswrapper[4834]: I1126 12:59:08.989909 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8e0a711b-a1c2-474b-87fb-729aaf8d00d7/probe/0.log" Nov 26 12:59:09 crc kubenswrapper[4834]: I1126 12:59:09.043823 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8e0a711b-a1c2-474b-87fb-729aaf8d00d7/manila-scheduler/0.log" Nov 26 12:59:09 crc kubenswrapper[4834]: I1126 12:59:09.133944 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-share-share1-0_629fb482-78c1-4e80-96c8-bfd0ec24993d/manila-share/0.log" Nov 26 12:59:09 crc kubenswrapper[4834]: I1126 12:59:09.172026 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_629fb482-78c1-4e80-96c8-bfd0ec24993d/probe/0.log" Nov 26 12:59:09 crc kubenswrapper[4834]: I1126 12:59:09.338044 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-bcb769647-fxvwp_c2ac86af-2073-466e-ad2b-5203fbd8036f/neutron-api/0.log" Nov 26 12:59:09 crc kubenswrapper[4834]: I1126 12:59:09.386346 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-bcb769647-fxvwp_c2ac86af-2073-466e-ad2b-5203fbd8036f/neutron-httpd/0.log" Nov 26 12:59:09 crc kubenswrapper[4834]: I1126 12:59:09.530890 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zp6bm_90cc7502-5ac6-4d61-a70d-d8bc10da96b7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:09 crc kubenswrapper[4834]: I1126 12:59:09.780193 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9a65c5e1-ffb4-429f-a58b-2e79a51acc6a/nova-api-log/0.log" Nov 26 12:59:09 crc kubenswrapper[4834]: I1126 12:59:09.900539 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f7e75301-90f4-4498-b64f-027c0fc4b257/nova-cell0-conductor-conductor/0.log" Nov 26 12:59:09 crc kubenswrapper[4834]: I1126 12:59:09.929427 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9a65c5e1-ffb4-429f-a58b-2e79a51acc6a/nova-api-api/0.log" Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.093962 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8c1632f2-4571-4112-83e2-c0bc5fa90d3e/nova-cell1-conductor-conductor/0.log" Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.194959 4834 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ff344ee7-ddb0-4286-bb4b-dd1b27e8c710/nova-cell1-novncproxy-novncproxy/0.log" Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.348356 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fsjpz_c3dd49ba-b6b6-4dd2-be83-7c04977d2b17/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.352744 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_56f1d827-26b6-46e8-8d2a-0559e7883478/nova-metadata-log/0.log" Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.680126 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4c5ced50-7529-4c2a-822b-0a10cf6a9700/mysql-bootstrap/0.log" Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.698008 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_774b42be-ec58-4678-b226-e590f1367ed2/nova-scheduler-scheduler/0.log" Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.709587 4834 generic.go:334] "Generic (PLEG): container finished" podID="736927ed-2a7f-474b-b9fb-893a009226af" containerID="fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1" exitCode=0 Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.709641 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5mvp" event={"ID":"736927ed-2a7f-474b-b9fb-893a009226af","Type":"ContainerDied","Data":"fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1"} Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.854880 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4c5ced50-7529-4c2a-822b-0a10cf6a9700/mysql-bootstrap/0.log" Nov 26 12:59:10 crc kubenswrapper[4834]: I1126 12:59:10.885982 4834 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4c5ced50-7529-4c2a-822b-0a10cf6a9700/galera/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.047647 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e6b694cd-3381-4f15-8d74-8cfc72753ae3/mysql-bootstrap/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.172953 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_56f1d827-26b6-46e8-8d2a-0559e7883478/nova-metadata-metadata/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.252122 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e6b694cd-3381-4f15-8d74-8cfc72753ae3/mysql-bootstrap/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.262691 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e6b694cd-3381-4f15-8d74-8cfc72753ae3/galera/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.353384 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_61d6f7eb-6f11-4c69-b31f-75701ac020c2/openstackclient/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.472867 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-n9hx4_2217b645-a751-42b8-be14-6587b294bf48/openstack-network-exporter/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.554604 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-n6tm2_b0ec582c-ead4-4350-ac7f-530f80804717/ovn-controller/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.666099 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7b48f_dc1e8363-c0e7-428e-b215-5d246d6c5094/ovsdb-server-init/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.722539 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-c5mvp" event={"ID":"736927ed-2a7f-474b-b9fb-893a009226af","Type":"ContainerStarted","Data":"4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e"} Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.749168 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c5mvp" podStartSLOduration=2.188475795 podStartE2EDuration="4.749149539s" podCreationTimestamp="2025-11-26 12:59:07 +0000 UTC" firstStartedPulling="2025-11-26 12:59:08.690919976 +0000 UTC m=+2846.598133328" lastFinishedPulling="2025-11-26 12:59:11.251593721 +0000 UTC m=+2849.158807072" observedRunningTime="2025-11-26 12:59:11.741059482 +0000 UTC m=+2849.648272833" watchObservedRunningTime="2025-11-26 12:59:11.749149539 +0000 UTC m=+2849.656362891" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.869539 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7b48f_dc1e8363-c0e7-428e-b215-5d246d6c5094/ovsdb-server-init/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.875888 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7b48f_dc1e8363-c0e7-428e-b215-5d246d6c5094/ovs-vswitchd/0.log" Nov 26 12:59:11 crc kubenswrapper[4834]: I1126 12:59:11.925647 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7b48f_dc1e8363-c0e7-428e-b215-5d246d6c5094/ovsdb-server/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.099153 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-m2cd4_21eb26c7-797e-4fa5-aa7f-b9acc565e6b6/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.118857 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_dbd7948b-27ac-4472-a49b-23eeac081fb9/openstack-network-exporter/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.176729 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dbd7948b-27ac-4472-a49b-23eeac081fb9/ovn-northd/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.306889 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6a511537-2e50-4f68-9c68-dcb20e489cb3/ovsdbserver-nb/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.381999 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6a511537-2e50-4f68-9c68-dcb20e489cb3/openstack-network-exporter/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.493818 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2b8fa876-0579-4ff1-be1c-4a45969fa4ae/openstack-network-exporter/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.542038 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2b8fa876-0579-4ff1-be1c-4a45969fa4ae/ovsdbserver-sb/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.668169 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8685c85bd8-7kkqj_7889c8e4-2ada-4349-bf82-f177d34a3ad7/placement-api/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.713477 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8685c85bd8-7kkqj_7889c8e4-2ada-4349-bf82-f177d34a3ad7/placement-log/0.log" Nov 26 12:59:12 crc kubenswrapper[4834]: I1126 12:59:12.790496 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0aa91ce0-4843-4e7b-b02c-4cc94d001abd/setup-container/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.023248 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0aa91ce0-4843-4e7b-b02c-4cc94d001abd/setup-container/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.060215 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0aa91ce0-4843-4e7b-b02c-4cc94d001abd/rabbitmq/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.078256 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3c521a82-8cae-4279-b12f-958ce3470c54/setup-container/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.272070 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3c521a82-8cae-4279-b12f-958ce3470c54/rabbitmq/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.310040 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-686bp_d0280880-1382-46e5-9c25-f9c3b9d8f3bd/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.319594 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3c521a82-8cae-4279-b12f-958ce3470c54/setup-container/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.514839 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-5pmfn_5492b71a-1903-424a-a46b-afe0e6713024/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.582061 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-4lsrq_851a4e1d-cfa4-495d-ac61-cce63eff30bc/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.684498 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-lzj67_3ec594bd-dbbe-4439-b255-798ecb061060/ssh-known-hosts-edpm-deployment/0.log" Nov 26 12:59:13 crc kubenswrapper[4834]: I1126 12:59:13.768047 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6x2db_0b19933f-0934-423b-b0f8-a25f6180446e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 26 12:59:17 crc kubenswrapper[4834]: I1126 12:59:17.457883 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:17 crc kubenswrapper[4834]: I1126 12:59:17.458349 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:17 crc kubenswrapper[4834]: I1126 12:59:17.501559 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:17 crc kubenswrapper[4834]: I1126 12:59:17.825280 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:17 crc kubenswrapper[4834]: I1126 12:59:17.870205 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c5mvp"] Nov 26 12:59:18 crc kubenswrapper[4834]: I1126 12:59:18.835914 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_614cfce6-4cb6-46ed-9012-a2ff7faf0a64/memcached/0.log" Nov 26 12:59:19 crc kubenswrapper[4834]: I1126 12:59:19.767892 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c5mvp" podUID="736927ed-2a7f-474b-b9fb-893a009226af" containerName="registry-server" containerID="cri-o://4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e" gracePeriod=2 Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 
12:59:20.144677 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f2bcg"] Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.147038 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.152285 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln4gh\" (UniqueName: \"kubernetes.io/projected/a0dcde94-18e0-4b56-9177-fb652b9d16f5-kube-api-access-ln4gh\") pod \"redhat-operators-f2bcg\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") " pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.152363 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-catalog-content\") pod \"redhat-operators-f2bcg\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") " pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.152387 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-utilities\") pod \"redhat-operators-f2bcg\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") " pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.156273 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f2bcg"] Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.180926 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.253657 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-catalog-content\") pod \"736927ed-2a7f-474b-b9fb-893a009226af\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.253727 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-utilities\") pod \"736927ed-2a7f-474b-b9fb-893a009226af\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.253806 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg79x\" (UniqueName: \"kubernetes.io/projected/736927ed-2a7f-474b-b9fb-893a009226af-kube-api-access-jg79x\") pod \"736927ed-2a7f-474b-b9fb-893a009226af\" (UID: \"736927ed-2a7f-474b-b9fb-893a009226af\") " Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.254026 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln4gh\" (UniqueName: \"kubernetes.io/projected/a0dcde94-18e0-4b56-9177-fb652b9d16f5-kube-api-access-ln4gh\") pod \"redhat-operators-f2bcg\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") " pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.254107 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-catalog-content\") pod \"redhat-operators-f2bcg\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") " pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 
12:59:20.254137 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-utilities\") pod \"redhat-operators-f2bcg\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") " pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.254439 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-utilities" (OuterVolumeSpecName: "utilities") pod "736927ed-2a7f-474b-b9fb-893a009226af" (UID: "736927ed-2a7f-474b-b9fb-893a009226af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.254518 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-catalog-content\") pod \"redhat-operators-f2bcg\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") " pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.254669 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.254665 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-utilities\") pod \"redhat-operators-f2bcg\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") " pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.262192 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736927ed-2a7f-474b-b9fb-893a009226af-kube-api-access-jg79x" 
(OuterVolumeSpecName: "kube-api-access-jg79x") pod "736927ed-2a7f-474b-b9fb-893a009226af" (UID: "736927ed-2a7f-474b-b9fb-893a009226af"). InnerVolumeSpecName "kube-api-access-jg79x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.270981 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln4gh\" (UniqueName: \"kubernetes.io/projected/a0dcde94-18e0-4b56-9177-fb652b9d16f5-kube-api-access-ln4gh\") pod \"redhat-operators-f2bcg\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") " pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.301738 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "736927ed-2a7f-474b-b9fb-893a009226af" (UID: "736927ed-2a7f-474b-b9fb-893a009226af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.356161 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736927ed-2a7f-474b-b9fb-893a009226af-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.356465 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg79x\" (UniqueName: \"kubernetes.io/projected/736927ed-2a7f-474b-b9fb-893a009226af-kube-api-access-jg79x\") on node \"crc\" DevicePath \"\"" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.489488 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f2bcg" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.780714 4834 generic.go:334] "Generic (PLEG): container finished" podID="736927ed-2a7f-474b-b9fb-893a009226af" containerID="4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e" exitCode=0 Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.780955 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5mvp" event={"ID":"736927ed-2a7f-474b-b9fb-893a009226af","Type":"ContainerDied","Data":"4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e"} Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.780978 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5mvp" event={"ID":"736927ed-2a7f-474b-b9fb-893a009226af","Type":"ContainerDied","Data":"da113dc487c8f99597e541aa88a2d1abcb13079123bfc175627178dfdcf1535c"} Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.780995 4834 scope.go:117] "RemoveContainer" containerID="4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.781101 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c5mvp" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.800854 4834 scope.go:117] "RemoveContainer" containerID="fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.803217 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c5mvp"] Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.810759 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c5mvp"] Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.823346 4834 scope.go:117] "RemoveContainer" containerID="a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.838032 4834 scope.go:117] "RemoveContainer" containerID="4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e" Nov 26 12:59:20 crc kubenswrapper[4834]: E1126 12:59:20.838441 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e\": container with ID starting with 4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e not found: ID does not exist" containerID="4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.838470 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e"} err="failed to get container status \"4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e\": rpc error: code = NotFound desc = could not find container \"4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e\": container with ID starting with 4489d53f56daa24cd77a5800523908d2ff6f157e96edd13d7b045fbf5238c17e not 
found: ID does not exist" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.838488 4834 scope.go:117] "RemoveContainer" containerID="fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1" Nov 26 12:59:20 crc kubenswrapper[4834]: E1126 12:59:20.839461 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1\": container with ID starting with fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1 not found: ID does not exist" containerID="fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.839491 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1"} err="failed to get container status \"fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1\": rpc error: code = NotFound desc = could not find container \"fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1\": container with ID starting with fd946645d76fae8d57039dfd5e9052ebc8b84c79da5d04597315eadd28011ee1 not found: ID does not exist" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.839510 4834 scope.go:117] "RemoveContainer" containerID="a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599" Nov 26 12:59:20 crc kubenswrapper[4834]: E1126 12:59:20.839810 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599\": container with ID starting with a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599 not found: ID does not exist" containerID="a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.839860 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599"} err="failed to get container status \"a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599\": rpc error: code = NotFound desc = could not find container \"a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599\": container with ID starting with a7873f7d03fadec3528614a949f39ddab382f9e9c2f2a4b7d37759900154c599 not found: ID does not exist" Nov 26 12:59:20 crc kubenswrapper[4834]: I1126 12:59:20.900595 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f2bcg"] Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.531231 4834 patch_prober.go:28] interesting pod/machine-config-daemon-xzb52 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.531292 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.531371 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.532247 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"} pod="openshift-machine-config-operator/machine-config-daemon-xzb52" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.532292 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerName="machine-config-daemon" containerID="cri-o://0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719" gracePeriod=600 Nov 26 12:59:21 crc kubenswrapper[4834]: E1126 12:59:21.651687 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.791272 4834 generic.go:334] "Generic (PLEG): container finished" podID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerID="c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f" exitCode=0 Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.791343 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2bcg" event={"ID":"a0dcde94-18e0-4b56-9177-fb652b9d16f5","Type":"ContainerDied","Data":"c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f"} Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.791573 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2bcg" event={"ID":"a0dcde94-18e0-4b56-9177-fb652b9d16f5","Type":"ContainerStarted","Data":"430eb252c11a99bee015fb307177377d2adf5c431e7b0fbb49a38f8703c743f6"} Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.792769 4834 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider
Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.795690 4834 generic.go:334] "Generic (PLEG): container finished" podID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719" exitCode=0
Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.795715 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerDied","Data":"0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"}
Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.795736 4834 scope.go:117] "RemoveContainer" containerID="9fd522d7710ec6a6cae7b65384302f26fcdba1cdfb96e53555c07816508c637d"
Nov 26 12:59:21 crc kubenswrapper[4834]: I1126 12:59:21.796175 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 12:59:21 crc kubenswrapper[4834]: E1126 12:59:21.796429 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:59:22 crc kubenswrapper[4834]: I1126 12:59:22.426406 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736927ed-2a7f-474b-b9fb-893a009226af" path="/var/lib/kubelet/pods/736927ed-2a7f-474b-b9fb-893a009226af/volumes"
Nov 26 12:59:22 crc kubenswrapper[4834]: I1126 12:59:22.814600 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2bcg" event={"ID":"a0dcde94-18e0-4b56-9177-fb652b9d16f5","Type":"ContainerStarted","Data":"e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25"}
Nov 26 12:59:23 crc kubenswrapper[4834]: I1126 12:59:23.823139 4834 generic.go:334] "Generic (PLEG): container finished" podID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerID="e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25" exitCode=0
Nov 26 12:59:23 crc kubenswrapper[4834]: I1126 12:59:23.823193 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2bcg" event={"ID":"a0dcde94-18e0-4b56-9177-fb652b9d16f5","Type":"ContainerDied","Data":"e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25"}
Nov 26 12:59:24 crc kubenswrapper[4834]: I1126 12:59:24.839763 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2bcg" event={"ID":"a0dcde94-18e0-4b56-9177-fb652b9d16f5","Type":"ContainerStarted","Data":"7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b"}
Nov 26 12:59:24 crc kubenswrapper[4834]: I1126 12:59:24.862606 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f2bcg" podStartSLOduration=2.178669956 podStartE2EDuration="4.862587284s" podCreationTimestamp="2025-11-26 12:59:20 +0000 UTC" firstStartedPulling="2025-11-26 12:59:21.792535607 +0000 UTC m=+2859.699748959" lastFinishedPulling="2025-11-26 12:59:24.476452935 +0000 UTC m=+2862.383666287" observedRunningTime="2025-11-26 12:59:24.854413178 +0000 UTC m=+2862.761626530" watchObservedRunningTime="2025-11-26 12:59:24.862587284 +0000 UTC m=+2862.769800637"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.537849 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qqsph"]
Nov 26 12:59:26 crc kubenswrapper[4834]: E1126 12:59:26.538609 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736927ed-2a7f-474b-b9fb-893a009226af" containerName="extract-content"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.538626 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="736927ed-2a7f-474b-b9fb-893a009226af" containerName="extract-content"
Nov 26 12:59:26 crc kubenswrapper[4834]: E1126 12:59:26.538656 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736927ed-2a7f-474b-b9fb-893a009226af" containerName="extract-utilities"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.538662 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="736927ed-2a7f-474b-b9fb-893a009226af" containerName="extract-utilities"
Nov 26 12:59:26 crc kubenswrapper[4834]: E1126 12:59:26.538677 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736927ed-2a7f-474b-b9fb-893a009226af" containerName="registry-server"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.538683 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="736927ed-2a7f-474b-b9fb-893a009226af" containerName="registry-server"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.538912 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="736927ed-2a7f-474b-b9fb-893a009226af" containerName="registry-server"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.540281 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.550481 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qqsph"]
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.596330 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-catalog-content\") pod \"certified-operators-qqsph\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.596504 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm7n6\" (UniqueName: \"kubernetes.io/projected/ab9ab591-ebd6-478f-8f45-2278a8d1f539-kube-api-access-dm7n6\") pod \"certified-operators-qqsph\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.596583 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-utilities\") pod \"certified-operators-qqsph\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.699077 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm7n6\" (UniqueName: \"kubernetes.io/projected/ab9ab591-ebd6-478f-8f45-2278a8d1f539-kube-api-access-dm7n6\") pod \"certified-operators-qqsph\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.699126 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-utilities\") pod \"certified-operators-qqsph\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.699252 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-catalog-content\") pod \"certified-operators-qqsph\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.699698 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-catalog-content\") pod \"certified-operators-qqsph\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.699789 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-utilities\") pod \"certified-operators-qqsph\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.716971 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm7n6\" (UniqueName: \"kubernetes.io/projected/ab9ab591-ebd6-478f-8f45-2278a8d1f539-kube-api-access-dm7n6\") pod \"certified-operators-qqsph\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:26 crc kubenswrapper[4834]: I1126 12:59:26.862983 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:27 crc kubenswrapper[4834]: I1126 12:59:27.345988 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qqsph"]
Nov 26 12:59:27 crc kubenswrapper[4834]: W1126 12:59:27.347264 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab9ab591_ebd6_478f_8f45_2278a8d1f539.slice/crio-ce4d1b6eb6a50871732690ce9bf7c2f93389de718b715ba4614e8750fc57438b WatchSource:0}: Error finding container ce4d1b6eb6a50871732690ce9bf7c2f93389de718b715ba4614e8750fc57438b: Status 404 returned error can't find the container with id ce4d1b6eb6a50871732690ce9bf7c2f93389de718b715ba4614e8750fc57438b
Nov 26 12:59:27 crc kubenswrapper[4834]: I1126 12:59:27.865837 4834 generic.go:334] "Generic (PLEG): container finished" podID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerID="78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97" exitCode=0
Nov 26 12:59:27 crc kubenswrapper[4834]: I1126 12:59:27.865946 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqsph" event={"ID":"ab9ab591-ebd6-478f-8f45-2278a8d1f539","Type":"ContainerDied","Data":"78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97"}
Nov 26 12:59:27 crc kubenswrapper[4834]: I1126 12:59:27.866169 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqsph" event={"ID":"ab9ab591-ebd6-478f-8f45-2278a8d1f539","Type":"ContainerStarted","Data":"ce4d1b6eb6a50871732690ce9bf7c2f93389de718b715ba4614e8750fc57438b"}
Nov 26 12:59:28 crc kubenswrapper[4834]: I1126 12:59:28.896128 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqsph" event={"ID":"ab9ab591-ebd6-478f-8f45-2278a8d1f539","Type":"ContainerStarted","Data":"a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f"}
Nov 26 12:59:30 crc kubenswrapper[4834]: I1126 12:59:30.490474 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f2bcg"
Nov 26 12:59:30 crc kubenswrapper[4834]: I1126 12:59:30.490955 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f2bcg"
Nov 26 12:59:30 crc kubenswrapper[4834]: I1126 12:59:30.542861 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f2bcg"
Nov 26 12:59:30 crc kubenswrapper[4834]: I1126 12:59:30.913501 4834 generic.go:334] "Generic (PLEG): container finished" podID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerID="a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f" exitCode=0
Nov 26 12:59:30 crc kubenswrapper[4834]: I1126 12:59:30.913581 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqsph" event={"ID":"ab9ab591-ebd6-478f-8f45-2278a8d1f539","Type":"ContainerDied","Data":"a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f"}
Nov 26 12:59:30 crc kubenswrapper[4834]: I1126 12:59:30.954873 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f2bcg"
Nov 26 12:59:31 crc kubenswrapper[4834]: I1126 12:59:31.921449 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqsph" event={"ID":"ab9ab591-ebd6-478f-8f45-2278a8d1f539","Type":"ContainerStarted","Data":"4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81"}
Nov 26 12:59:31 crc kubenswrapper[4834]: I1126 12:59:31.931597 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f2bcg"]
Nov 26 12:59:31 crc kubenswrapper[4834]: I1126 12:59:31.935148 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qqsph" podStartSLOduration=2.37912684 podStartE2EDuration="5.935131384s" podCreationTimestamp="2025-11-26 12:59:26 +0000 UTC" firstStartedPulling="2025-11-26 12:59:27.867127884 +0000 UTC m=+2865.774341236" lastFinishedPulling="2025-11-26 12:59:31.423132428 +0000 UTC m=+2869.330345780" observedRunningTime="2025-11-26 12:59:31.933887679 +0000 UTC m=+2869.841101031" watchObservedRunningTime="2025-11-26 12:59:31.935131384 +0000 UTC m=+2869.842344726"
Nov 26 12:59:32 crc kubenswrapper[4834]: I1126 12:59:32.932686 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f2bcg" podUID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerName="registry-server" containerID="cri-o://7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b" gracePeriod=2
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.321697 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f2bcg"
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.334045 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-catalog-content\") pod \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") "
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.334088 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-utilities\") pod \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") "
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.334136 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln4gh\" (UniqueName: \"kubernetes.io/projected/a0dcde94-18e0-4b56-9177-fb652b9d16f5-kube-api-access-ln4gh\") pod \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\" (UID: \"a0dcde94-18e0-4b56-9177-fb652b9d16f5\") "
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.334794 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-utilities" (OuterVolumeSpecName: "utilities") pod "a0dcde94-18e0-4b56-9177-fb652b9d16f5" (UID: "a0dcde94-18e0-4b56-9177-fb652b9d16f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.339818 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0dcde94-18e0-4b56-9177-fb652b9d16f5-kube-api-access-ln4gh" (OuterVolumeSpecName: "kube-api-access-ln4gh") pod "a0dcde94-18e0-4b56-9177-fb652b9d16f5" (UID: "a0dcde94-18e0-4b56-9177-fb652b9d16f5"). InnerVolumeSpecName "kube-api-access-ln4gh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.401581 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0dcde94-18e0-4b56-9177-fb652b9d16f5" (UID: "a0dcde94-18e0-4b56-9177-fb652b9d16f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.436149 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.436194 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0dcde94-18e0-4b56-9177-fb652b9d16f5-utilities\") on node \"crc\" DevicePath \"\""
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.436205 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln4gh\" (UniqueName: \"kubernetes.io/projected/a0dcde94-18e0-4b56-9177-fb652b9d16f5-kube-api-access-ln4gh\") on node \"crc\" DevicePath \"\""
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.791615 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf_7fa1d84c-4b0c-45bf-881d-01d0a25ea746/util/0.log"
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.944261 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf_7fa1d84c-4b0c-45bf-881d-01d0a25ea746/util/0.log"
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.946010 4834 generic.go:334] "Generic (PLEG): container finished" podID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerID="7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b" exitCode=0
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.946062 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2bcg" event={"ID":"a0dcde94-18e0-4b56-9177-fb652b9d16f5","Type":"ContainerDied","Data":"7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b"}
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.946077 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f2bcg"
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.946098 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2bcg" event={"ID":"a0dcde94-18e0-4b56-9177-fb652b9d16f5","Type":"ContainerDied","Data":"430eb252c11a99bee015fb307177377d2adf5c431e7b0fbb49a38f8703c743f6"}
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.946119 4834 scope.go:117] "RemoveContainer" containerID="7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b"
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.952613 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf_7fa1d84c-4b0c-45bf-881d-01d0a25ea746/pull/0.log"
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.955738 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf_7fa1d84c-4b0c-45bf-881d-01d0a25ea746/pull/0.log"
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.968106 4834 scope.go:117] "RemoveContainer" containerID="e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25"
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.971937 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f2bcg"]
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.977534 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f2bcg"]
Nov 26 12:59:33 crc kubenswrapper[4834]: I1126 12:59:33.993717 4834 scope.go:117] "RemoveContainer" containerID="c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.030958 4834 scope.go:117] "RemoveContainer" containerID="7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b"
Nov 26 12:59:34 crc kubenswrapper[4834]: E1126 12:59:34.031327 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b\": container with ID starting with 7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b not found: ID does not exist" containerID="7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.031366 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b"} err="failed to get container status \"7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b\": rpc error: code = NotFound desc = could not find container \"7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b\": container with ID starting with 7727e1ae5266d151041e0c589bfbd03d2b98bf46b9473bed4013c94b5bd9f56b not found: ID does not exist"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.031392 4834 scope.go:117] "RemoveContainer" containerID="e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25"
Nov 26 12:59:34 crc kubenswrapper[4834]: E1126 12:59:34.031583 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25\": container with ID starting with e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25 not found: ID does not exist" containerID="e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.031603 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25"} err="failed to get container status \"e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25\": rpc error: code = NotFound desc = could not find container \"e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25\": container with ID starting with e68e140c4430d237322f7f401db4cc57ab80c41e6177aac5efbebe6085b3bd25 not found: ID does not exist"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.031616 4834 scope.go:117] "RemoveContainer" containerID="c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f"
Nov 26 12:59:34 crc kubenswrapper[4834]: E1126 12:59:34.032444 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f\": container with ID starting with c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f not found: ID does not exist" containerID="c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.032470 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f"} err="failed to get container status \"c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f\": rpc error: code = NotFound desc = could not find container \"c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f\": container with ID starting with c247ad46269c6884ff403e1bd62d8253913f2e426ab816f23bcc1f934478859f not found: ID does not exist"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.168983 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf_7fa1d84c-4b0c-45bf-881d-01d0a25ea746/extract/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.192519 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf_7fa1d84c-4b0c-45bf-881d-01d0a25ea746/pull/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.194803 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3711d63ee32771f989196ada809b479d54ff48bb9ee48b91ea976dcc5akg8kf_7fa1d84c-4b0c-45bf-881d-01d0a25ea746/util/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.332287 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-c5rjq_90b33e2e-c7dc-4e9e-b479-dca5251277bc/kube-rbac-proxy/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.361254 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-c5rjq_90b33e2e-c7dc-4e9e-b479-dca5251277bc/manager/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.379720 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-m4bdb_d85e5da5-d129-4904-8bde-6ff4bb92614f/kube-rbac-proxy/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.426231 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" path="/var/lib/kubelet/pods/a0dcde94-18e0-4b56-9177-fb652b9d16f5/volumes"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.526204 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-7vd97_702bd8a5-fc1e-4ee9-b85b-01ea9d177a97/kube-rbac-proxy/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.537252 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-7vd97_702bd8a5-fc1e-4ee9-b85b-01ea9d177a97/manager/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.546768 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-m4bdb_d85e5da5-d129-4904-8bde-6ff4bb92614f/manager/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.673451 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-st8n9_c476edc2-bbe3-4dca-a1fc-9a9c95f758c3/kube-rbac-proxy/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.730205 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-st8n9_c476edc2-bbe3-4dca-a1fc-9a9c95f758c3/manager/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.857767 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-zbptq_7b521baa-5390-41d7-8654-7b556346833d/kube-rbac-proxy/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.866570 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-zbptq_7b521baa-5390-41d7-8654-7b556346833d/manager/0.log"
Nov 26 12:59:34 crc kubenswrapper[4834]: I1126 12:59:34.982762 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-h4t44_dc55c2f3-1e8f-48ac-9d6d-581737e07566/kube-rbac-proxy/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.042822 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-h4t44_dc55c2f3-1e8f-48ac-9d6d-581737e07566/manager/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.068591 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-btq96_7f07c17d-8260-47a7-b1e1-0f16226838a7/kube-rbac-proxy/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.209477 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-xf2w7_2b31b20d-186a-4fb2-bfa2-914e5eda233e/kube-rbac-proxy/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.214979 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-xf2w7_2b31b20d-186a-4fb2-bfa2-914e5eda233e/manager/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.312697 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-btq96_7f07c17d-8260-47a7-b1e1-0f16226838a7/manager/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.432985 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-2gfhc_c6c80d48-ce10-48bd-8cfb-67db8079dc1b/kube-rbac-proxy/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.455292 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-2gfhc_c6c80d48-ce10-48bd-8cfb-67db8079dc1b/manager/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.592887 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-ffwjp_a6b8f0bf-a405-4b2a-91b2-1934cd2997b2/kube-rbac-proxy/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.639229 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-ffwjp_a6b8f0bf-a405-4b2a-91b2-1934cd2997b2/manager/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.695210 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-6phws_8911dae6-36bd-410e-847a-c7c7134bb5a4/kube-rbac-proxy/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.759982 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-6phws_8911dae6-36bd-410e-847a-c7c7134bb5a4/manager/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.799528 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-7nh8j_3811206c-bdee-4ea2-9f7c-7be96426f677/kube-rbac-proxy/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.898231 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-7nh8j_3811206c-bdee-4ea2-9f7c-7be96426f677/manager/0.log"
Nov 26 12:59:35 crc kubenswrapper[4834]: I1126 12:59:35.933973 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-n4xj9_14c2b899-bac2-43dd-844e-a66f4d75954a/kube-rbac-proxy/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.067250 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-n4xj9_14c2b899-bac2-43dd-844e-a66f4d75954a/manager/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.093854 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-kcp4x_657e5afd-3aba-4acf-a85d-36ef32e8c5f8/kube-rbac-proxy/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.178169 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-kcp4x_657e5afd-3aba-4acf-a85d-36ef32e8c5f8/manager/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.282432 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-674cb676c8z4lrf_e37121ad-deca-4553-ae91-2a61eb0f9aac/kube-rbac-proxy/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.293205 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-674cb676c8z4lrf_e37121ad-deca-4553-ae91-2a61eb0f9aac/manager/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.417319 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 12:59:36 crc kubenswrapper[4834]: E1126 12:59:36.417554 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.512804 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-598cfbdbc-hvk9z_8267804e-bacd-49ad-a0b2-0168e5d6be37/operator/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.710523 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-x8kqp_bf2e56f9-16d2-4807-9650-87625dcfacd2/registry-server/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.734965 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-5s84k_796eea14-80b6-43fc-b682-1fdaf61253ee/kube-rbac-proxy/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.863126 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.863627 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.908286 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.949103 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-ppdpl_39e5784b-2de7-45cb-9741-a0840599fb52/kube-rbac-proxy/0.log"
Nov 26 12:59:36 crc kubenswrapper[4834]: I1126 12:59:36.963404 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-5s84k_796eea14-80b6-43fc-b682-1fdaf61253ee/manager/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.006565 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qqsph"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.146882 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-qmpxl_d4c6fd9f-cee0-4f73-8a60-214b4c99e0f9/operator/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.165414 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-ppdpl_39e5784b-2de7-45cb-9741-a0840599fb52/manager/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.346878 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-jtv6r_c9fbb19b-a6b3-47d2-b5c8-6574fa71c069/manager/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.380726 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-jtv6r_c9fbb19b-a6b3-47d2-b5c8-6574fa71c069/kube-rbac-proxy/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.488624 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-wz75v_6ab4e014-587f-483e-83dd-4e0327e17828/kube-rbac-proxy/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.529736 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-659d75f7c6-78m9j_de8f1d07-0e85-4245-a378-51c81152ef64/manager/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.602971 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-wz75v_6ab4e014-587f-483e-83dd-4e0327e17828/manager/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.633680 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-7s4pw_ddae14f0-dc74-43ea-937c-144dd9ecdb62/kube-rbac-proxy/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.702838 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-7s4pw_ddae14f0-dc74-43ea-937c-144dd9ecdb62/manager/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.756042 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6f76cf5bc4-m745d_8c0416e2-355c-4b19-b275-10d029919025/kube-rbac-proxy/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.869048 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-dfcmd_52aee873-a614-4018-aa3d-beb4021c29f6/kube-rbac-proxy/0.log"
Nov 26 12:59:37 crc kubenswrapper[4834]: I1126 12:59:37.870714 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-dfcmd_52aee873-a614-4018-aa3d-beb4021c29f6/manager/0.log"
Nov 26 12:59:38 crc kubenswrapper[4834]: I1126 12:59:38.132530 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qqsph"]
Nov 26 12:59:39 crc kubenswrapper[4834]: I1126 12:59:39.993988 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qqsph" podUID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerName="registry-server" containerID="cri-o://4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81" gracePeriod=2
Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.391489 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qqsph" Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.580774 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-catalog-content\") pod \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.580839 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-utilities\") pod \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.580881 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm7n6\" (UniqueName: \"kubernetes.io/projected/ab9ab591-ebd6-478f-8f45-2278a8d1f539-kube-api-access-dm7n6\") pod \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\" (UID: \"ab9ab591-ebd6-478f-8f45-2278a8d1f539\") " Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.582832 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-utilities" (OuterVolumeSpecName: "utilities") pod "ab9ab591-ebd6-478f-8f45-2278a8d1f539" (UID: "ab9ab591-ebd6-478f-8f45-2278a8d1f539"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.587821 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9ab591-ebd6-478f-8f45-2278a8d1f539-kube-api-access-dm7n6" (OuterVolumeSpecName: "kube-api-access-dm7n6") pod "ab9ab591-ebd6-478f-8f45-2278a8d1f539" (UID: "ab9ab591-ebd6-478f-8f45-2278a8d1f539"). InnerVolumeSpecName "kube-api-access-dm7n6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.615041 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab9ab591-ebd6-478f-8f45-2278a8d1f539" (UID: "ab9ab591-ebd6-478f-8f45-2278a8d1f539"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.683500 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.683537 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab591-ebd6-478f-8f45-2278a8d1f539-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 12:59:40 crc kubenswrapper[4834]: I1126 12:59:40.683548 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm7n6\" (UniqueName: \"kubernetes.io/projected/ab9ab591-ebd6-478f-8f45-2278a8d1f539-kube-api-access-dm7n6\") on node \"crc\" DevicePath \"\"" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.003348 4834 generic.go:334] "Generic (PLEG): container finished" podID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerID="4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81" exitCode=0 Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.003398 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qqsph" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.003404 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqsph" event={"ID":"ab9ab591-ebd6-478f-8f45-2278a8d1f539","Type":"ContainerDied","Data":"4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81"} Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.003535 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqsph" event={"ID":"ab9ab591-ebd6-478f-8f45-2278a8d1f539","Type":"ContainerDied","Data":"ce4d1b6eb6a50871732690ce9bf7c2f93389de718b715ba4614e8750fc57438b"} Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.003558 4834 scope.go:117] "RemoveContainer" containerID="4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.022874 4834 scope.go:117] "RemoveContainer" containerID="a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.033662 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qqsph"] Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.041836 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qqsph"] Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.054804 4834 scope.go:117] "RemoveContainer" containerID="78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.077658 4834 scope.go:117] "RemoveContainer" containerID="4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81" Nov 26 12:59:41 crc kubenswrapper[4834]: E1126 12:59:41.078644 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81\": container with ID starting with 4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81 not found: ID does not exist" containerID="4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.078692 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81"} err="failed to get container status \"4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81\": rpc error: code = NotFound desc = could not find container \"4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81\": container with ID starting with 4801d5bd8b0568aa58fdeee41765fe07d201620cb2f2ee1b1df8e954df6c3a81 not found: ID does not exist" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.078729 4834 scope.go:117] "RemoveContainer" containerID="a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f" Nov 26 12:59:41 crc kubenswrapper[4834]: E1126 12:59:41.079059 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f\": container with ID starting with a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f not found: ID does not exist" containerID="a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.079111 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f"} err="failed to get container status \"a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f\": rpc error: code = NotFound desc = could not find container \"a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f\": container with ID 
starting with a8da5e8a3c1e90f96f4973b43e5b6466756f798fb169e263da0e1b054df5d90f not found: ID does not exist" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.079145 4834 scope.go:117] "RemoveContainer" containerID="78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97" Nov 26 12:59:41 crc kubenswrapper[4834]: E1126 12:59:41.079483 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97\": container with ID starting with 78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97 not found: ID does not exist" containerID="78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97" Nov 26 12:59:41 crc kubenswrapper[4834]: I1126 12:59:41.079511 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97"} err="failed to get container status \"78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97\": rpc error: code = NotFound desc = could not find container \"78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97\": container with ID starting with 78926921ee1c3dc2915effe39b76151182e8c56372166e4d179e1a5b08f92f97 not found: ID does not exist" Nov 26 12:59:42 crc kubenswrapper[4834]: I1126 12:59:42.435465 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" path="/var/lib/kubelet/pods/ab9ab591-ebd6-478f-8f45-2278a8d1f539/volumes" Nov 26 12:59:49 crc kubenswrapper[4834]: I1126 12:59:49.418427 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719" Nov 26 12:59:49 crc kubenswrapper[4834]: E1126 12:59:49.419740 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 12:59:53 crc kubenswrapper[4834]: I1126 12:59:53.165816 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2dhk4_2d5dd7f9-7abf-4c5e-ad19-96e010400267/control-plane-machine-set-operator/0.log" Nov 26 12:59:53 crc kubenswrapper[4834]: I1126 12:59:53.326948 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jztgs_0d700278-027d-44e8-b8e3-c262fc0a6b44/kube-rbac-proxy/0.log" Nov 26 12:59:53 crc kubenswrapper[4834]: I1126 12:59:53.336813 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jztgs_0d700278-027d-44e8-b8e3-c262fc0a6b44/machine-api-operator/0.log" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.139199 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg"] Nov 26 13:00:00 crc kubenswrapper[4834]: E1126 13:00:00.140729 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerName="extract-content" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.140752 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerName="extract-content" Nov 26 13:00:00 crc kubenswrapper[4834]: E1126 13:00:00.140780 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerName="extract-utilities" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.140787 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerName="extract-utilities" Nov 26 13:00:00 crc kubenswrapper[4834]: E1126 13:00:00.140813 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerName="registry-server" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.140821 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerName="registry-server" Nov 26 13:00:00 crc kubenswrapper[4834]: E1126 13:00:00.140829 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerName="extract-content" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.140835 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerName="extract-content" Nov 26 13:00:00 crc kubenswrapper[4834]: E1126 13:00:00.140848 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerName="registry-server" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.140855 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerName="registry-server" Nov 26 13:00:00 crc kubenswrapper[4834]: E1126 13:00:00.140863 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerName="extract-utilities" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.140869 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerName="extract-utilities" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.141108 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0dcde94-18e0-4b56-9177-fb652b9d16f5" containerName="registry-server" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.141124 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ab9ab591-ebd6-478f-8f45-2278a8d1f539" containerName="registry-server" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.142336 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.149165 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.149487 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg"] Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.149650 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.169990 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjf9f\" (UniqueName: \"kubernetes.io/projected/46a316f6-6887-4aea-9f0f-b4298b96d50f-kube-api-access-rjf9f\") pod \"collect-profiles-29402700-wngfg\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.170053 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46a316f6-6887-4aea-9f0f-b4298b96d50f-secret-volume\") pod \"collect-profiles-29402700-wngfg\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.170261 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/46a316f6-6887-4aea-9f0f-b4298b96d50f-config-volume\") pod \"collect-profiles-29402700-wngfg\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.272065 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjf9f\" (UniqueName: \"kubernetes.io/projected/46a316f6-6887-4aea-9f0f-b4298b96d50f-kube-api-access-rjf9f\") pod \"collect-profiles-29402700-wngfg\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.272111 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46a316f6-6887-4aea-9f0f-b4298b96d50f-secret-volume\") pod \"collect-profiles-29402700-wngfg\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.272153 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46a316f6-6887-4aea-9f0f-b4298b96d50f-config-volume\") pod \"collect-profiles-29402700-wngfg\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.273194 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46a316f6-6887-4aea-9f0f-b4298b96d50f-config-volume\") pod \"collect-profiles-29402700-wngfg\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.283250 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46a316f6-6887-4aea-9f0f-b4298b96d50f-secret-volume\") pod \"collect-profiles-29402700-wngfg\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.302697 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjf9f\" (UniqueName: \"kubernetes.io/projected/46a316f6-6887-4aea-9f0f-b4298b96d50f-kube-api-access-rjf9f\") pod \"collect-profiles-29402700-wngfg\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.421100 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719" Nov 26 13:00:00 crc kubenswrapper[4834]: E1126 13:00:00.421441 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.457613 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:00 crc kubenswrapper[4834]: I1126 13:00:00.871260 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg"] Nov 26 13:00:01 crc kubenswrapper[4834]: I1126 13:00:01.165232 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" event={"ID":"46a316f6-6887-4aea-9f0f-b4298b96d50f","Type":"ContainerStarted","Data":"914651fde2f4ee8699b572107927059674ca884b96a5f65b52c0bc3c3b27a676"} Nov 26 13:00:01 crc kubenswrapper[4834]: I1126 13:00:01.165567 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" event={"ID":"46a316f6-6887-4aea-9f0f-b4298b96d50f","Type":"ContainerStarted","Data":"029d18685e2980379e226acbad009f285d633f9ed755604946b0e5ed544318a4"} Nov 26 13:00:01 crc kubenswrapper[4834]: I1126 13:00:01.182470 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" podStartSLOduration=1.182455917 podStartE2EDuration="1.182455917s" podCreationTimestamp="2025-11-26 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:00:01.182035855 +0000 UTC m=+2899.089249208" watchObservedRunningTime="2025-11-26 13:00:01.182455917 +0000 UTC m=+2899.089669269" Nov 26 13:00:02 crc kubenswrapper[4834]: I1126 13:00:02.184040 4834 generic.go:334] "Generic (PLEG): container finished" podID="46a316f6-6887-4aea-9f0f-b4298b96d50f" containerID="914651fde2f4ee8699b572107927059674ca884b96a5f65b52c0bc3c3b27a676" exitCode=0 Nov 26 13:00:02 crc kubenswrapper[4834]: I1126 13:00:02.184147 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" event={"ID":"46a316f6-6887-4aea-9f0f-b4298b96d50f","Type":"ContainerDied","Data":"914651fde2f4ee8699b572107927059674ca884b96a5f65b52c0bc3c3b27a676"} Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.414145 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-6cl2q_576cb695-a9de-4c83-a7a5-0727a9b6899d/cert-manager-controller/0.log" Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.497546 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.575727 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-j8kwl_d1a808a5-cb4c-4edf-b738-cc825cc68a1a/cert-manager-cainjector/0.log" Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.644023 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjf9f\" (UniqueName: \"kubernetes.io/projected/46a316f6-6887-4aea-9f0f-b4298b96d50f-kube-api-access-rjf9f\") pod \"46a316f6-6887-4aea-9f0f-b4298b96d50f\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.644171 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46a316f6-6887-4aea-9f0f-b4298b96d50f-config-volume\") pod \"46a316f6-6887-4aea-9f0f-b4298b96d50f\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.644259 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46a316f6-6887-4aea-9f0f-b4298b96d50f-secret-volume\") pod \"46a316f6-6887-4aea-9f0f-b4298b96d50f\" (UID: \"46a316f6-6887-4aea-9f0f-b4298b96d50f\") " 
Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.644780 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a316f6-6887-4aea-9f0f-b4298b96d50f-config-volume" (OuterVolumeSpecName: "config-volume") pod "46a316f6-6887-4aea-9f0f-b4298b96d50f" (UID: "46a316f6-6887-4aea-9f0f-b4298b96d50f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.650119 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a316f6-6887-4aea-9f0f-b4298b96d50f-kube-api-access-rjf9f" (OuterVolumeSpecName: "kube-api-access-rjf9f") pod "46a316f6-6887-4aea-9f0f-b4298b96d50f" (UID: "46a316f6-6887-4aea-9f0f-b4298b96d50f"). InnerVolumeSpecName "kube-api-access-rjf9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.650206 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a316f6-6887-4aea-9f0f-b4298b96d50f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "46a316f6-6887-4aea-9f0f-b4298b96d50f" (UID: "46a316f6-6887-4aea-9f0f-b4298b96d50f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.670753 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-ql4fg_a2c348a1-3019-4a84-bd2a-416c7be748ea/cert-manager-webhook/0.log" Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.746891 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46a316f6-6887-4aea-9f0f-b4298b96d50f-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.747080 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjf9f\" (UniqueName: \"kubernetes.io/projected/46a316f6-6887-4aea-9f0f-b4298b96d50f-kube-api-access-rjf9f\") on node \"crc\" DevicePath \"\"" Nov 26 13:00:03 crc kubenswrapper[4834]: I1126 13:00:03.747135 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46a316f6-6887-4aea-9f0f-b4298b96d50f-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 13:00:04 crc kubenswrapper[4834]: I1126 13:00:04.197974 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" event={"ID":"46a316f6-6887-4aea-9f0f-b4298b96d50f","Type":"ContainerDied","Data":"029d18685e2980379e226acbad009f285d633f9ed755604946b0e5ed544318a4"} Nov 26 13:00:04 crc kubenswrapper[4834]: I1126 13:00:04.198015 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="029d18685e2980379e226acbad009f285d633f9ed755604946b0e5ed544318a4" Nov 26 13:00:04 crc kubenswrapper[4834]: I1126 13:00:04.198040 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402700-wngfg" Nov 26 13:00:04 crc kubenswrapper[4834]: I1126 13:00:04.242589 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj"] Nov 26 13:00:04 crc kubenswrapper[4834]: I1126 13:00:04.248378 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402655-trzcj"] Nov 26 13:00:04 crc kubenswrapper[4834]: I1126 13:00:04.426553 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25328d2d-3fa8-4a87-bc53-dc9088802bbf" path="/var/lib/kubelet/pods/25328d2d-3fa8-4a87-bc53-dc9088802bbf/volumes" Nov 26 13:00:11 crc kubenswrapper[4834]: I1126 13:00:11.417217 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719" Nov 26 13:00:11 crc kubenswrapper[4834]: E1126 13:00:11.417948 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 13:00:12 crc kubenswrapper[4834]: I1126 13:00:12.675889 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-nrvxw_1b9a0efe-1ec3-4a49-b612-a4dd462f9b9f/nmstate-console-plugin/0.log" Nov 26 13:00:12 crc kubenswrapper[4834]: I1126 13:00:12.788327 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mxvrw_dcb6e9f8-dc59-4d0b-ba58-67b66d54fc61/nmstate-handler/0.log" Nov 26 13:00:12 crc kubenswrapper[4834]: I1126 13:00:12.823684 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-8847n_28c26289-ba85-46ce-b757-883b0ab3db27/kube-rbac-proxy/0.log" Nov 26 13:00:12 crc kubenswrapper[4834]: I1126 13:00:12.857003 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-8847n_28c26289-ba85-46ce-b757-883b0ab3db27/nmstate-metrics/0.log" Nov 26 13:00:12 crc kubenswrapper[4834]: I1126 13:00:12.971522 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-xhzg5_0783fd9d-f205-4dee-87a3-be44ae70102a/nmstate-operator/0.log" Nov 26 13:00:13 crc kubenswrapper[4834]: I1126 13:00:13.007301 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-p4hf7_62e4d8b8-503c-4ce0-b534-95d90fae4c76/nmstate-webhook/0.log" Nov 26 13:00:22 crc kubenswrapper[4834]: I1126 13:00:22.972882 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-fsv98_a96c765a-d06b-44f4-9c21-41abc86dfa7c/kube-rbac-proxy/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.089953 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-fsv98_a96c765a-d06b-44f4-9c21-41abc86dfa7c/controller/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.199563 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-frr-files/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.316808 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-reloader/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.317609 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-metrics/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 
13:00:23.322350 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-frr-files/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.347271 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-reloader/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.513147 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-frr-files/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.527484 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-metrics/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.530910 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-reloader/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.551417 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-metrics/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.728409 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-reloader/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.885496 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-metrics/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.889478 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/controller/0.log" Nov 26 13:00:23 crc kubenswrapper[4834]: I1126 13:00:23.906452 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/cp-frr-files/0.log" Nov 26 13:00:24 crc kubenswrapper[4834]: I1126 13:00:24.030780 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/frr-metrics/0.log" Nov 26 13:00:24 crc kubenswrapper[4834]: I1126 13:00:24.085627 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/kube-rbac-proxy-frr/0.log" Nov 26 13:00:24 crc kubenswrapper[4834]: I1126 13:00:24.101356 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/kube-rbac-proxy/0.log" Nov 26 13:00:24 crc kubenswrapper[4834]: I1126 13:00:24.219675 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/reloader/0.log" Nov 26 13:00:24 crc kubenswrapper[4834]: I1126 13:00:24.283155 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-jwdtf_18f9e616-aa50-4bbb-adf2-d635f4ed3b0b/frr-k8s-webhook-server/0.log" Nov 26 13:00:24 crc kubenswrapper[4834]: I1126 13:00:24.417984 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719" Nov 26 13:00:24 crc kubenswrapper[4834]: E1126 13:00:24.418225 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 13:00:24 crc kubenswrapper[4834]: I1126 13:00:24.446476 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-558846f6-bdb45_7038dc0b-b755-4c32-871a-27327898c558/manager/0.log" Nov 26 13:00:24 crc kubenswrapper[4834]: I1126 13:00:24.627789 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-884765884-jmk72_a8711e7a-0128-4160-9be7-6b25ccdf2ea6/webhook-server/0.log" Nov 26 13:00:24 crc kubenswrapper[4834]: I1126 13:00:24.699725 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gkhb7_cd4f4222-097e-4ec4-840c-8acd707eb05c/kube-rbac-proxy/0.log" Nov 26 13:00:25 crc kubenswrapper[4834]: I1126 13:00:25.174200 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gkhb7_cd4f4222-097e-4ec4-840c-8acd707eb05c/speaker/0.log" Nov 26 13:00:25 crc kubenswrapper[4834]: I1126 13:00:25.234820 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9bcr6_08a409b5-356f-4044-b624-bc95b687b192/frr/0.log" Nov 26 13:00:34 crc kubenswrapper[4834]: I1126 13:00:34.543680 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42_c81bac3a-9953-4132-9869-f1f12d228844/util/0.log" Nov 26 13:00:34 crc kubenswrapper[4834]: I1126 13:00:34.692174 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42_c81bac3a-9953-4132-9869-f1f12d228844/util/0.log" Nov 26 13:00:34 crc kubenswrapper[4834]: I1126 13:00:34.720080 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42_c81bac3a-9953-4132-9869-f1f12d228844/pull/0.log" Nov 26 13:00:34 crc kubenswrapper[4834]: I1126 13:00:34.766881 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42_c81bac3a-9953-4132-9869-f1f12d228844/pull/0.log" Nov 26 13:00:34 crc kubenswrapper[4834]: I1126 13:00:34.888969 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42_c81bac3a-9953-4132-9869-f1f12d228844/util/0.log" Nov 26 13:00:34 crc kubenswrapper[4834]: I1126 13:00:34.907444 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42_c81bac3a-9953-4132-9869-f1f12d228844/pull/0.log" Nov 26 13:00:34 crc kubenswrapper[4834]: I1126 13:00:34.912032 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772engg42_c81bac3a-9953-4132-9869-f1f12d228844/extract/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.137127 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-99nqk_1816ee93-0f51-41b5-9763-03a43aa4b6a7/extract-utilities/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.278439 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-99nqk_1816ee93-0f51-41b5-9763-03a43aa4b6a7/extract-utilities/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.281118 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-99nqk_1816ee93-0f51-41b5-9763-03a43aa4b6a7/extract-content/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.333892 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-99nqk_1816ee93-0f51-41b5-9763-03a43aa4b6a7/extract-content/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.497112 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-99nqk_1816ee93-0f51-41b5-9763-03a43aa4b6a7/extract-content/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.505730 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-99nqk_1816ee93-0f51-41b5-9763-03a43aa4b6a7/extract-utilities/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.654717 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l9v6t_0d5b15f7-54f3-4bde-b304-4e803caf4309/extract-utilities/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.852414 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-99nqk_1816ee93-0f51-41b5-9763-03a43aa4b6a7/registry-server/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.877048 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l9v6t_0d5b15f7-54f3-4bde-b304-4e803caf4309/extract-content/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.915417 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l9v6t_0d5b15f7-54f3-4bde-b304-4e803caf4309/extract-utilities/0.log" Nov 26 13:00:35 crc kubenswrapper[4834]: I1126 13:00:35.953711 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l9v6t_0d5b15f7-54f3-4bde-b304-4e803caf4309/extract-content/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.057972 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l9v6t_0d5b15f7-54f3-4bde-b304-4e803caf4309/extract-utilities/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.062725 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-l9v6t_0d5b15f7-54f3-4bde-b304-4e803caf4309/extract-content/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.273392 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544_1c0edd6e-551f-4a07-9201-bc4b9d13a796/util/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.421821 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719" Nov 26 13:00:36 crc kubenswrapper[4834]: E1126 13:00:36.422073 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.461635 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l9v6t_0d5b15f7-54f3-4bde-b304-4e803caf4309/registry-server/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.481174 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544_1c0edd6e-551f-4a07-9201-bc4b9d13a796/util/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.491814 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544_1c0edd6e-551f-4a07-9201-bc4b9d13a796/pull/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.494293 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544_1c0edd6e-551f-4a07-9201-bc4b9d13a796/pull/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.617919 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544_1c0edd6e-551f-4a07-9201-bc4b9d13a796/pull/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.629180 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544_1c0edd6e-551f-4a07-9201-bc4b9d13a796/util/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.657961 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6w5544_1c0edd6e-551f-4a07-9201-bc4b9d13a796/extract/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.762932 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gv52n_c7e72b81-7b18-4afb-ad1e-818ff77aaf27/marketplace-operator/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.802862 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt9bt_073c7670-7cb5-4160-b6f2-d301f594dd00/extract-utilities/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.965701 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt9bt_073c7670-7cb5-4160-b6f2-d301f594dd00/extract-content/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.968626 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt9bt_073c7670-7cb5-4160-b6f2-d301f594dd00/extract-utilities/0.log" Nov 26 13:00:36 crc kubenswrapper[4834]: I1126 13:00:36.975531 4834 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt9bt_073c7670-7cb5-4160-b6f2-d301f594dd00/extract-content/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.127420 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt9bt_073c7670-7cb5-4160-b6f2-d301f594dd00/extract-content/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.133050 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt9bt_073c7670-7cb5-4160-b6f2-d301f594dd00/extract-utilities/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.210804 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt9bt_073c7670-7cb5-4160-b6f2-d301f594dd00/registry-server/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.285670 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nb8ks_923697b4-3ab4-4e51-8f10-501c3c2cdff6/extract-utilities/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.372740 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nb8ks_923697b4-3ab4-4e51-8f10-501c3c2cdff6/extract-utilities/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.393274 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nb8ks_923697b4-3ab4-4e51-8f10-501c3c2cdff6/extract-content/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.427059 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nb8ks_923697b4-3ab4-4e51-8f10-501c3c2cdff6/extract-content/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.567527 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-nb8ks_923697b4-3ab4-4e51-8f10-501c3c2cdff6/extract-utilities/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.592134 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nb8ks_923697b4-3ab4-4e51-8f10-501c3c2cdff6/extract-content/0.log" Nov 26 13:00:37 crc kubenswrapper[4834]: I1126 13:00:37.852406 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nb8ks_923697b4-3ab4-4e51-8f10-501c3c2cdff6/registry-server/0.log" Nov 26 13:00:46 crc kubenswrapper[4834]: I1126 13:00:46.711995 4834 scope.go:117] "RemoveContainer" containerID="a99a0d7694cf5ce97ba92bfb1f9fe7f9ba9860a0a956a28a24fd65dd7bc355f0" Nov 26 13:00:48 crc kubenswrapper[4834]: I1126 13:00:48.416623 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719" Nov 26 13:00:48 crc kubenswrapper[4834]: E1126 13:00:48.417022 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 13:00:58 crc kubenswrapper[4834]: E1126 13:00:58.427925 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" image="38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517" Nov 26 13:00:58 crc 
kubenswrapper[4834]: E1126 13:00:58.428476 4834 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" image="38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517" Nov 26 13:00:58 crc kubenswrapper[4834]: E1126 13:00:58.429266 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wm75x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6f76cf5bc4-m745d_openstack-operators(8c0416e2-355c-4b19-b275-10d029919025): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" logger="UnhandledError" Nov 26 13:00:58 crc kubenswrapper[4834]: E1126 13:00:58.430414 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get 
\\\"http://38.102.83.98:5001/v2/\\\": dial tcp 38.102.83.98:5001: i/o timeout\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025" Nov 26 13:00:59 crc kubenswrapper[4834]: I1126 13:00:59.418725 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719" Nov 26 13:00:59 crc kubenswrapper[4834]: E1126 13:00:59.419381 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.138405 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29402701-9vcvd"] Nov 26 13:01:00 crc kubenswrapper[4834]: E1126 13:01:00.139747 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a316f6-6887-4aea-9f0f-b4298b96d50f" containerName="collect-profiles" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.139813 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a316f6-6887-4aea-9f0f-b4298b96d50f" containerName="collect-profiles" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.140048 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a316f6-6887-4aea-9f0f-b4298b96d50f" containerName="collect-profiles" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.140638 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.149247 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29402701-9vcvd"] Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.183079 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-fernet-keys\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.183168 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-config-data\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.183230 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-combined-ca-bundle\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.183286 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nbd8\" (UniqueName: \"kubernetes.io/projected/86daa8b1-0f7f-441c-b56a-c13c1603586c-kube-api-access-7nbd8\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.285110 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-fernet-keys\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.285200 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-config-data\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.285258 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-combined-ca-bundle\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.285294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nbd8\" (UniqueName: \"kubernetes.io/projected/86daa8b1-0f7f-441c-b56a-c13c1603586c-kube-api-access-7nbd8\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.291692 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-combined-ca-bundle\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.292037 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-config-data\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.294460 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-fernet-keys\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.303995 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nbd8\" (UniqueName: \"kubernetes.io/projected/86daa8b1-0f7f-441c-b56a-c13c1603586c-kube-api-access-7nbd8\") pod \"keystone-cron-29402701-9vcvd\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") " pod="openstack/keystone-cron-29402701-9vcvd" Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.456362 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29402701-9vcvd"
Nov 26 13:01:00 crc kubenswrapper[4834]: E1126 13:01:00.795041 4834 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.148:51028->192.168.26.148:32785: write tcp 192.168.26.148:51028->192.168.26.148:32785: write: broken pipe
Nov 26 13:01:00 crc kubenswrapper[4834]: I1126 13:01:00.878560 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29402701-9vcvd"]
Nov 26 13:01:00 crc kubenswrapper[4834]: W1126 13:01:00.891162 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86daa8b1_0f7f_441c_b56a_c13c1603586c.slice/crio-a403273bf1422ac94e08d2ff17aefc34ee9349708e3811bcd75f871b0b117983 WatchSource:0}: Error finding container a403273bf1422ac94e08d2ff17aefc34ee9349708e3811bcd75f871b0b117983: Status 404 returned error can't find the container with id a403273bf1422ac94e08d2ff17aefc34ee9349708e3811bcd75f871b0b117983
Nov 26 13:01:01 crc kubenswrapper[4834]: I1126 13:01:01.627465 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402701-9vcvd" event={"ID":"86daa8b1-0f7f-441c-b56a-c13c1603586c","Type":"ContainerStarted","Data":"86c3e24e0c96fd8bc66149be3761352e71e2aef975cd1870286c4f381ecb8280"}
Nov 26 13:01:01 crc kubenswrapper[4834]: I1126 13:01:01.627908 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402701-9vcvd" event={"ID":"86daa8b1-0f7f-441c-b56a-c13c1603586c","Type":"ContainerStarted","Data":"a403273bf1422ac94e08d2ff17aefc34ee9349708e3811bcd75f871b0b117983"}
Nov 26 13:01:01 crc kubenswrapper[4834]: I1126 13:01:01.648202 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29402701-9vcvd" podStartSLOduration=1.64818511 podStartE2EDuration="1.64818511s" podCreationTimestamp="2025-11-26 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:01:01.64228789 +0000 UTC m=+2959.549501241" watchObservedRunningTime="2025-11-26 13:01:01.64818511 +0000 UTC m=+2959.555398462"
Nov 26 13:01:03 crc kubenswrapper[4834]: I1126 13:01:03.642346 4834 generic.go:334] "Generic (PLEG): container finished" podID="86daa8b1-0f7f-441c-b56a-c13c1603586c" containerID="86c3e24e0c96fd8bc66149be3761352e71e2aef975cd1870286c4f381ecb8280" exitCode=0
Nov 26 13:01:03 crc kubenswrapper[4834]: I1126 13:01:03.642542 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402701-9vcvd" event={"ID":"86daa8b1-0f7f-441c-b56a-c13c1603586c","Type":"ContainerDied","Data":"86c3e24e0c96fd8bc66149be3761352e71e2aef975cd1870286c4f381ecb8280"}
Nov 26 13:01:04 crc kubenswrapper[4834]: I1126 13:01:04.957636 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29402701-9vcvd"
Nov 26 13:01:04 crc kubenswrapper[4834]: I1126 13:01:04.975104 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nbd8\" (UniqueName: \"kubernetes.io/projected/86daa8b1-0f7f-441c-b56a-c13c1603586c-kube-api-access-7nbd8\") pod \"86daa8b1-0f7f-441c-b56a-c13c1603586c\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") "
Nov 26 13:01:04 crc kubenswrapper[4834]: I1126 13:01:04.975227 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-combined-ca-bundle\") pod \"86daa8b1-0f7f-441c-b56a-c13c1603586c\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") "
Nov 26 13:01:04 crc kubenswrapper[4834]: I1126 13:01:04.975495 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-fernet-keys\") pod \"86daa8b1-0f7f-441c-b56a-c13c1603586c\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") "
Nov 26 13:01:04 crc kubenswrapper[4834]: I1126 13:01:04.975569 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-config-data\") pod \"86daa8b1-0f7f-441c-b56a-c13c1603586c\" (UID: \"86daa8b1-0f7f-441c-b56a-c13c1603586c\") "
Nov 26 13:01:04 crc kubenswrapper[4834]: I1126 13:01:04.981533 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "86daa8b1-0f7f-441c-b56a-c13c1603586c" (UID: "86daa8b1-0f7f-441c-b56a-c13c1603586c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 13:01:04 crc kubenswrapper[4834]: I1126 13:01:04.981654 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86daa8b1-0f7f-441c-b56a-c13c1603586c-kube-api-access-7nbd8" (OuterVolumeSpecName: "kube-api-access-7nbd8") pod "86daa8b1-0f7f-441c-b56a-c13c1603586c" (UID: "86daa8b1-0f7f-441c-b56a-c13c1603586c"). InnerVolumeSpecName "kube-api-access-7nbd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 13:01:05 crc kubenswrapper[4834]: I1126 13:01:05.012964 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86daa8b1-0f7f-441c-b56a-c13c1603586c" (UID: "86daa8b1-0f7f-441c-b56a-c13c1603586c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 13:01:05 crc kubenswrapper[4834]: I1126 13:01:05.016622 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-config-data" (OuterVolumeSpecName: "config-data") pod "86daa8b1-0f7f-441c-b56a-c13c1603586c" (UID: "86daa8b1-0f7f-441c-b56a-c13c1603586c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 13:01:05 crc kubenswrapper[4834]: I1126 13:01:05.077277 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 13:01:05 crc kubenswrapper[4834]: I1126 13:01:05.077559 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nbd8\" (UniqueName: \"kubernetes.io/projected/86daa8b1-0f7f-441c-b56a-c13c1603586c-kube-api-access-7nbd8\") on node \"crc\" DevicePath \"\""
Nov 26 13:01:05 crc kubenswrapper[4834]: I1126 13:01:05.077571 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 13:01:05 crc kubenswrapper[4834]: I1126 13:01:05.077581 4834 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86daa8b1-0f7f-441c-b56a-c13c1603586c-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 26 13:01:05 crc kubenswrapper[4834]: I1126 13:01:05.669659 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402701-9vcvd" event={"ID":"86daa8b1-0f7f-441c-b56a-c13c1603586c","Type":"ContainerDied","Data":"a403273bf1422ac94e08d2ff17aefc34ee9349708e3811bcd75f871b0b117983"}
Nov 26 13:01:05 crc kubenswrapper[4834]: I1126 13:01:05.669742 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a403273bf1422ac94e08d2ff17aefc34ee9349708e3811bcd75f871b0b117983"
Nov 26 13:01:05 crc kubenswrapper[4834]: I1126 13:01:05.669817 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29402701-9vcvd"
Nov 26 13:01:11 crc kubenswrapper[4834]: I1126 13:01:11.416833 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:01:11 crc kubenswrapper[4834]: E1126 13:01:11.417684 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:01:11 crc kubenswrapper[4834]: E1126 13:01:11.418695 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025"
Nov 26 13:01:24 crc kubenswrapper[4834]: E1126 13:01:24.425539 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025"
Nov 26 13:01:26 crc kubenswrapper[4834]: I1126 13:01:26.417612 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:01:26 crc kubenswrapper[4834]: E1126 13:01:26.418088 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:01:38 crc kubenswrapper[4834]: E1126 13:01:38.419174 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025"
Nov 26 13:01:39 crc kubenswrapper[4834]: I1126 13:01:39.417115 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:01:39 crc kubenswrapper[4834]: E1126 13:01:39.417532 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:01:40 crc kubenswrapper[4834]: I1126 13:01:40.030912 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-b2qmr"]
Nov 26 13:01:40 crc kubenswrapper[4834]: I1126 13:01:40.043287 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-246d-account-create-update-9jk9p"]
Nov 26 13:01:40 crc kubenswrapper[4834]: I1126 13:01:40.047813 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-b2qmr"]
Nov 26 13:01:40 crc kubenswrapper[4834]: I1126 13:01:40.053920 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-246d-account-create-update-9jk9p"]
Nov 26 13:01:40 crc kubenswrapper[4834]: I1126 13:01:40.426424 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f2a300-39e4-4bfc-bb9a-5646fe44709c" path="/var/lib/kubelet/pods/69f2a300-39e4-4bfc-bb9a-5646fe44709c/volumes"
Nov 26 13:01:40 crc kubenswrapper[4834]: I1126 13:01:40.427723 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="915f4671-4ace-4684-b095-75ee89fc9c7b" path="/var/lib/kubelet/pods/915f4671-4ace-4684-b095-75ee89fc9c7b/volumes"
Nov 26 13:01:46 crc kubenswrapper[4834]: I1126 13:01:46.762820 4834 scope.go:117] "RemoveContainer" containerID="852f8b0ee55db909f7916448f36100bacb1e2af50b181a0ac9283d65b0a254ab"
Nov 26 13:01:46 crc kubenswrapper[4834]: I1126 13:01:46.782793 4834 scope.go:117] "RemoveContainer" containerID="b5bfd1eadc4d4ac6a9959fbcc6c60e8fee14a95279a5d8a53268539a77c817c0"
Nov 26 13:01:53 crc kubenswrapper[4834]: I1126 13:01:53.417591 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:01:53 crc kubenswrapper[4834]: E1126 13:01:53.418470 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:01:54 crc kubenswrapper[4834]: I1126 13:01:54.040762 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-4bj4f"]
Nov 26 13:01:54 crc kubenswrapper[4834]: I1126 13:01:54.050291 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-4bj4f"]
Nov 26 13:01:54 crc kubenswrapper[4834]: I1126 13:01:54.426213 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b67b910b-d864-4d0f-9f34-921e1cdd0517" path="/var/lib/kubelet/pods/b67b910b-d864-4d0f-9f34-921e1cdd0517/volumes"
Nov 26 13:01:55 crc kubenswrapper[4834]: I1126 13:01:55.060248 4834 generic.go:334] "Generic (PLEG): container finished" podID="1a4c19b6-76ea-4977-bac4-7f1406bee595" containerID="df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559" exitCode=0
Nov 26 13:01:55 crc kubenswrapper[4834]: I1126 13:01:55.060304 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q54tt/must-gather-njj5w" event={"ID":"1a4c19b6-76ea-4977-bac4-7f1406bee595","Type":"ContainerDied","Data":"df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559"}
Nov 26 13:01:55 crc kubenswrapper[4834]: I1126 13:01:55.061585 4834 scope.go:117] "RemoveContainer" containerID="df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559"
Nov 26 13:01:55 crc kubenswrapper[4834]: I1126 13:01:55.453261 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q54tt_must-gather-njj5w_1a4c19b6-76ea-4977-bac4-7f1406bee595/gather/0.log"
Nov 26 13:02:02 crc kubenswrapper[4834]: I1126 13:02:02.672607 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q54tt/must-gather-njj5w"]
Nov 26 13:02:02 crc kubenswrapper[4834]: I1126 13:02:02.673386 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-q54tt/must-gather-njj5w" podUID="1a4c19b6-76ea-4977-bac4-7f1406bee595" containerName="copy" containerID="cri-o://16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab" gracePeriod=2
Nov 26 13:02:02 crc kubenswrapper[4834]: I1126 13:02:02.677994 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q54tt/must-gather-njj5w"]
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.017854 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q54tt_must-gather-njj5w_1a4c19b6-76ea-4977-bac4-7f1406bee595/copy/0.log"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.018565 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q54tt/must-gather-njj5w"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.090245 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcm4l\" (UniqueName: \"kubernetes.io/projected/1a4c19b6-76ea-4977-bac4-7f1406bee595-kube-api-access-vcm4l\") pod \"1a4c19b6-76ea-4977-bac4-7f1406bee595\" (UID: \"1a4c19b6-76ea-4977-bac4-7f1406bee595\") "
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.090407 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a4c19b6-76ea-4977-bac4-7f1406bee595-must-gather-output\") pod \"1a4c19b6-76ea-4977-bac4-7f1406bee595\" (UID: \"1a4c19b6-76ea-4977-bac4-7f1406bee595\") "
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.095583 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4c19b6-76ea-4977-bac4-7f1406bee595-kube-api-access-vcm4l" (OuterVolumeSpecName: "kube-api-access-vcm4l") pod "1a4c19b6-76ea-4977-bac4-7f1406bee595" (UID: "1a4c19b6-76ea-4977-bac4-7f1406bee595"). InnerVolumeSpecName "kube-api-access-vcm4l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.113607 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q54tt_must-gather-njj5w_1a4c19b6-76ea-4977-bac4-7f1406bee595/copy/0.log"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.113931 4834 generic.go:334] "Generic (PLEG): container finished" podID="1a4c19b6-76ea-4977-bac4-7f1406bee595" containerID="16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab" exitCode=143
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.113984 4834 scope.go:117] "RemoveContainer" containerID="16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.113990 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q54tt/must-gather-njj5w"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.134510 4834 scope.go:117] "RemoveContainer" containerID="df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.192120 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcm4l\" (UniqueName: \"kubernetes.io/projected/1a4c19b6-76ea-4977-bac4-7f1406bee595-kube-api-access-vcm4l\") on node \"crc\" DevicePath \"\""
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.213494 4834 scope.go:117] "RemoveContainer" containerID="16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab"
Nov 26 13:02:03 crc kubenswrapper[4834]: E1126 13:02:03.225420 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab\": container with ID starting with 16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab not found: ID does not exist" containerID="16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.225468 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab"} err="failed to get container status \"16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab\": rpc error: code = NotFound desc = could not find container \"16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab\": container with ID starting with 16f011984f8aa76cfc5dc027cbffc35dbe004591d11bbbfcbd4e487be5c175ab not found: ID does not exist"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.225494 4834 scope.go:117] "RemoveContainer" containerID="df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.231248 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a4c19b6-76ea-4977-bac4-7f1406bee595-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1a4c19b6-76ea-4977-bac4-7f1406bee595" (UID: "1a4c19b6-76ea-4977-bac4-7f1406bee595"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 13:02:03 crc kubenswrapper[4834]: E1126 13:02:03.231393 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559\": container with ID starting with df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559 not found: ID does not exist" containerID="df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.231429 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559"} err="failed to get container status \"df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559\": rpc error: code = NotFound desc = could not find container \"df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559\": container with ID starting with df50b31354f172187c4fbbc30158d554d77c41329090783df14ae71ecce67559 not found: ID does not exist"
Nov 26 13:02:03 crc kubenswrapper[4834]: I1126 13:02:03.293842 4834 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a4c19b6-76ea-4977-bac4-7f1406bee595-must-gather-output\") on node \"crc\" DevicePath \"\""
Nov 26 13:02:04 crc kubenswrapper[4834]: I1126 13:02:04.425295 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4c19b6-76ea-4977-bac4-7f1406bee595" path="/var/lib/kubelet/pods/1a4c19b6-76ea-4977-bac4-7f1406bee595/volumes"
Nov 26 13:02:07 crc kubenswrapper[4834]: I1126 13:02:07.417162 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:02:07 crc kubenswrapper[4834]: E1126 13:02:07.420994 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:02:20 crc kubenswrapper[4834]: I1126 13:02:20.418180 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:02:20 crc kubenswrapper[4834]: E1126 13:02:20.419136 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:02:32 crc kubenswrapper[4834]: I1126 13:02:32.422592 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:02:32 crc kubenswrapper[4834]: E1126 13:02:32.423451 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:02:46 crc kubenswrapper[4834]: I1126 13:02:46.873066 4834 scope.go:117] "RemoveContainer" containerID="fe22f91d0777809dad4c61f6974fe149d36bf20729102bdb45d24af0085ebf53"
Nov 26 13:02:47 crc kubenswrapper[4834]: I1126 13:02:47.417392 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:02:47 crc kubenswrapper[4834]: E1126 13:02:47.417694 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:02:58 crc kubenswrapper[4834]: I1126 13:02:58.417796 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:02:58 crc kubenswrapper[4834]: E1126 13:02:58.418538 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:03:12 crc kubenswrapper[4834]: I1126 13:03:12.422244 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:03:12 crc kubenswrapper[4834]: E1126 13:03:12.422905 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:03:27 crc kubenswrapper[4834]: I1126 13:03:27.417249 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:03:27 crc kubenswrapper[4834]: E1126 13:03:27.418129 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:03:42 crc kubenswrapper[4834]: I1126 13:03:42.422238 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:03:42 crc kubenswrapper[4834]: E1126 13:03:42.422942 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:03:50 crc kubenswrapper[4834]: E1126 13:03:50.425082 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" image="38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517"
Nov 26 13:03:50 crc kubenswrapper[4834]: E1126 13:03:50.425462 4834 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" image="38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517"
Nov 26 13:03:50 crc kubenswrapper[4834]: E1126 13:03:50.425586 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wm75x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6f76cf5bc4-m745d_openstack-operators(8c0416e2-355c-4b19-b275-10d029919025): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \"http://38.102.83.98:5001/v2/\": dial tcp 38.102.83.98:5001: i/o timeout" logger="UnhandledError"
Nov 26 13:03:50 crc kubenswrapper[4834]: E1126 13:03:50.426741 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517: pinging container registry 38.102.83.98:5001: Get \\\"http://38.102.83.98:5001/v2/\\\": dial tcp 38.102.83.98:5001: i/o timeout\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025"
Nov 26 13:03:55 crc kubenswrapper[4834]: I1126 13:03:55.417129 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:03:55 crc kubenswrapper[4834]: E1126 13:03:55.417785 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:04:05 crc kubenswrapper[4834]: E1126 13:04:05.418728 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025"
Nov 26 13:04:10 crc kubenswrapper[4834]: I1126 13:04:10.417200 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:04:10 crc kubenswrapper[4834]: E1126 13:04:10.417810 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:04:17 crc kubenswrapper[4834]: E1126 13:04:17.418858 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025"
Nov 26 13:04:21 crc kubenswrapper[4834]: I1126 13:04:21.416959 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:04:21 crc kubenswrapper[4834]: E1126 13:04:21.417814 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzb52_openshift-machine-config-operator(b15e8745-fc1a-4575-ac07-e483f8e41c8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" podUID="b15e8745-fc1a-4575-ac07-e483f8e41c8d"
Nov 26 13:04:30 crc kubenswrapper[4834]: E1126 13:04:30.419456 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025"
Nov 26 13:04:35 crc kubenswrapper[4834]: I1126 13:04:35.416982 4834 scope.go:117] "RemoveContainer" containerID="0d3d8946855afd77b226863ac610815c17b4ab4d5ef689bcc284b59da9b3c719"
Nov 26 13:04:36 crc kubenswrapper[4834]: I1126 13:04:36.201912 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzb52" event={"ID":"b15e8745-fc1a-4575-ac07-e483f8e41c8d","Type":"ContainerStarted","Data":"23c1e6703d1bd51f782ea459209211aad4eba207c1240e37263dd29ddc02d369"}
Nov 26 13:04:42 crc kubenswrapper[4834]: E1126 13:04:42.427082 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025"
Nov 26 13:04:55 crc kubenswrapper[4834]: E1126 13:04:55.419159 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/openstack-k8s-operators/test-operator:68b77c48042be317ac8e0bb694f24ff66dc1b517\\\"\"" pod="openstack-operators/test-operator-controller-manager-6f76cf5bc4-m745d" podUID="8c0416e2-355c-4b19-b275-10d029919025"